Warning: Permanently added '2620:52:6:1161:dead:beef:cafe:c122' (ED25519) to the list of known hosts.
You can reproduce this build on your computer by running:

  sudo dnf install copr-rpmbuild
  /usr/bin/copr-rpmbuild --verbose --drop-resultdir --task-url https://copr.fedorainfracloud.org/backend/get-build-task/10217947-fedora-rawhide-ppc64le --chroot fedora-rawhide-ppc64le

Version: 1.6
PID: 5539
Logging PID: 5541
Task:
{'allow_user_ssh': False,
 'appstream': False,
 'background': False,
 'build_id': 10217947,
 'buildroot_pkgs': [],
 'chroot': 'fedora-rawhide-ppc64le',
 'enable_net': False,
 'fedora_review': False,
 'git_hash': 'ca8ee5cfda74dfe513fdd9e1f31aac1c6ef1a930',
 'git_repo': 'https://copr-dist-git.fedorainfracloud.org/git/r0x0d/python-pydocket/python-pydocket',
 'isolation': 'default',
 'memory_reqs': 2048,
 'package_name': 'python-pydocket',
 'package_version': '0.17.9-2',
 'project_dirname': 'python-pydocket',
 'project_name': 'python-pydocket',
 'project_owner': 'r0x0d',
 'repo_priority': None,
 'repos': [{'baseurl': 'https://download.copr.fedorainfracloud.org/results/r0x0d/python-pydocket/fedora-rawhide-ppc64le/',
            'id': 'copr_base',
            'name': 'Copr repository',
            'priority': None}],
 'sandbox': 'r0x0d/python-pydocket--r0x0d',
 'source_json': {},
 'source_type': None,
 'ssh_public_keys': None,
 'storage': 0,
 'submitter': 'r0x0d',
 'tags': [],
 'task_id': '10217947-fedora-rawhide-ppc64le',
 'timeout': 18000,
 'uses_devel_repo': False,
 'with_opts': [],
 'without_opts': []}

Running: git clone https://copr-dist-git.fedorainfracloud.org/git/r0x0d/python-pydocket/python-pydocket /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket --depth 500 --no-single-branch --recursive
cmd: ['git', 'clone', 'https://copr-dist-git.fedorainfracloud.org/git/r0x0d/python-pydocket/python-pydocket', '/var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket', '--depth', '500', '--no-single-branch', '--recursive']
cwd: .
rc: 0
stdout:
stderr: Cloning into '/var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket'...
Running: git checkout ca8ee5cfda74dfe513fdd9e1f31aac1c6ef1a930 --
cmd: ['git', 'checkout', 'ca8ee5cfda74dfe513fdd9e1f31aac1c6ef1a930', '--']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket
rc: 0
stdout:
stderr: Note: switching to 'ca8ee5cfda74dfe513fdd9e1f31aac1c6ef1a930'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at ca8ee5c automatic import of python-pydocket
Running: dist-git-client sources
cmd: ['dist-git-client', 'sources']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket
rc: 0
stdout:
stderr: INFO: Reading stdout from command: git rev-parse --abbrev-ref HEAD
INFO: Reading stdout from command: git rev-parse HEAD
INFO: Reading sources specification file: sources
INFO: Downloading pydocket-0.17.9.tar.gz
INFO: Reading stdout from command: curl --help all
INFO: Calling: curl -H Pragma: -H 'Accept-Encoding: identity' -o pydocket-0.17.9.tar.gz --location --connect-timeout 60 --retry 3 --retry-delay 10 --remote-time --show-error --fail --retry-all-errors https://copr-dist-git.fedorainfracloud.org/repo/pkgs/r0x0d/python-pydocket/python-pydocket/pydocket-0.17.9.tar.gz/md5/383526fbd90631115d39d7ec1511cf29/pydocket-0.17.9.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  340k  100  340k    0     0  3225k      0 --:--:-- --:--:-- --:--:-- 3241k
INFO: Reading stdout from command: md5sum pydocket-0.17.9.tar.gz
tail: /var/lib/copr-rpmbuild/main.log: file truncated
Running (timeout=18000): unbuffer mock --spec /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket/python-pydocket.spec --sources /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket --resultdir /var/lib/copr-rpmbuild/results --uniqueext 1773335321.895804 -r /var/lib/copr-rpmbuild/results/configs/child.cfg
INFO: mock.py version 6.7 starting (python version = 3.14.2, NVR = mock-6.7-1.fc43), args: /usr/libexec/mock/mock --spec /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket/python-pydocket.spec --sources /var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket --resultdir /var/lib/copr-rpmbuild/results --uniqueext 1773335321.895804 -r /var/lib/copr-rpmbuild/results/configs/child.cfg
Start(bootstrap): init plugins
INFO: tmpfs initialized
INFO: selinux enabled
INFO: chroot_scan: initialized
INFO: compress_logs: initialized
Finish(bootstrap): init plugins
Start: init plugins
INFO: tmpfs initialized
INFO: selinux enabled
INFO: chroot_scan: initialized
INFO: compress_logs: initialized
Finish: init plugins
INFO: Signal handler active
Start: run
INFO: Start(/var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket/python-pydocket.spec)  Config(fedora-rawhide-ppc64le)
Start: clean chroot
Finish: clean chroot
Mock Version: 6.7
INFO: Mock Version: 6.7
Start(bootstrap): chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-ppc64le-bootstrap-1773335321.895804/root.
INFO: calling preinit hooks
INFO: enabled root cache
INFO: enabled package manager cache
Start(bootstrap): cleaning package manager metadata
Finish(bootstrap): cleaning package manager metadata
INFO: Guessed host environment type: unknown
INFO: Using container image: registry.fedoraproject.org/fedora:rawhide
INFO: Pulling image: registry.fedoraproject.org/fedora:rawhide
INFO: Tagging container image as mock-bootstrap-fd0fcaed-4bd4-451e-b81b-091adc3a18fb
INFO: Checking that d52611da65f2aeba0a724a81812a6ed329a02337f5433aeb95f552eb6cd40c2d image matches host's architecture
INFO: Copy content of container d52611da65f2aeba0a724a81812a6ed329a02337f5433aeb95f552eb6cd40c2d to /var/lib/mock/fedora-rawhide-ppc64le-bootstrap-1773335321.895804/root
INFO: mounting d52611da65f2aeba0a724a81812a6ed329a02337f5433aeb95f552eb6cd40c2d with podman image mount
INFO: image d52611da65f2aeba0a724a81812a6ed329a02337f5433aeb95f552eb6cd40c2d as /var/lib/containers/storage/overlay/18fe36f7315ca45d2db48694798739c5e82939c10a9ad3fcdfd3a0a98504d6b6/merged
INFO: umounting image d52611da65f2aeba0a724a81812a6ed329a02337f5433aeb95f552eb6cd40c2d (/var/lib/containers/storage/overlay/18fe36f7315ca45d2db48694798739c5e82939c10a9ad3fcdfd3a0a98504d6b6/merged) with podman image umount
INFO: Removing image mock-bootstrap-fd0fcaed-4bd4-451e-b81b-091adc3a18fb
INFO: Package manager dnf5 detected and used (fallback)
INFO: Not updating bootstrap chroot, bootstrap_image_ready=True
Start(bootstrap): creating root cache
Finish(bootstrap): creating root cache
Finish(bootstrap): chroot init
Start: chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-ppc64le-1773335321.895804/root.
INFO: calling preinit hooks
INFO: enabled root cache
INFO: enabled package manager cache
Start: cleaning package manager metadata
Finish: cleaning package manager metadata
INFO: enabled HW Info plugin
INFO: Package manager dnf5 detected and used (direct choice)
INFO: Buildroot is handled by package management downloaded with a bootstrap image:
  rpm-6.0.1-5.fc45.ppc64le
  rpm-sequoia-1.10.1-1.fc45.ppc64le
  dnf5-5.4.0.0-4.fc45.ppc64le
  dnf5-plugins-5.4.0.0-4.fc45.ppc64le
Start: installing minimal buildroot with dnf5
Updating and loading repositories:
 Copr repository 100% | 1.3 KiB/s | 1.6 KiB | 00m01s
 fedora 100% | 10.6 MiB/s | 19.9 MiB | 00m02s
Repositories loaded.
Package  Arch  Version  Repository  Size
Installing group/module packages:
 bash  ppc64le  0:5.3.9-3.fc44  fedora  8.9 MiB
 bzip2  ppc64le  0:1.0.8-23.fc44  fedora  170.8 KiB
 coreutils  ppc64le  0:9.10-3.fc45  fedora  9.4 MiB
 cpio  ppc64le  0:2.15-9.fc44  fedora  1.2 MiB
 diffutils  ppc64le  0:3.12-5.fc44  fedora  1.7 MiB
 fedora-release-common  noarch  0:45-0.2  fedora  20.6 KiB
 findutils  ppc64le  1:4.10.0-7.fc44  fedora  2.0 MiB
 gawk  ppc64le  0:5.4.0-2.fc45  fedora  3.1 MiB
 glibc-minimal-langpack  ppc64le  0:2.43.9000-4.fc45  fedora  0.0 B
 grep  ppc64le  0:3.12-3.fc44  fedora  1.0 MiB
 gzip  ppc64le  0:1.14-2.fc44  fedora  437.3 KiB
 info  ppc64le  0:7.3-1.fc45  fedora  488.1 KiB
 patch  ppc64le  0:2.8-4.fc44  fedora  262.3 KiB
 redhat-rpm-config  noarch  0:344-1.fc45  fedora  183.7 KiB
 rpm-build  ppc64le  0:6.0.1-5.fc45  fedora  662.4 KiB
 sed  ppc64le  0:4.9-8.fc45  fedora  937.0 KiB
 shadow-utils  ppc64le  2:4.19.3-1.fc45  fedora  4.9 MiB
 tar  ppc64le  2:1.35-8.fc44  fedora  3.1 MiB
 unzip  ppc64le  0:6.0-69.fc44  fedora  533.4 KiB
 util-linux  ppc64le  0:2.41.3-12.fc44  fedora  6.9 MiB
 which  ppc64le  0:2.23-4.fc44  fedora  123.2 KiB
 xz  ppc64le  1:5.8.2-2.fc44  fedora  1.4 MiB
Installing dependencies:
 R-srpm-macros  noarch  0:1.3.5-1.fc45  fedora  3.5 KiB
 add-determinism  ppc64le  0:0.7.2-4.fc45  fedora  2.3 MiB
 alternatives  ppc64le  0:1.33-5.fc44  fedora  89.9 KiB
 ansible-srpm-macros  noarch  0:1-20.1.fc44  fedora  35.7 KiB
 audit-libs  ppc64le  0:4.1.3-1.fc44  fedora  550.2 KiB
 binutils  ppc64le  0:2.46.50-2.fc45  fedora  33.6 MiB
 build-reproducibility-srpm-macros  noarch  0:0.7.2-4.fc45  fedora  1.2 KiB
 bzip2-libs  ppc64le  0:1.0.8-23.fc44  fedora  136.4 KiB
 ca-certificates  noarch  0:2025.2.80_v9.0.304-6.fc45  fedora  2.7 MiB
 cmake-srpm-macros  noarch  0:4.2.3-2.fc45  fedora  524.0 B
 coreutils-common  ppc64le  0:9.10-3.fc45  fedora  10.7 MiB
 crypto-policies  noarch  0:20251128-3.git19878fe.fc44  fedora  132.6 KiB
 curl  ppc64le  0:8.19.0~rc3-1.fc45  fedora  527.4 KiB
 cyrus-sasl-lib  ppc64le  0:2.1.28-35.fc44  fedora  2.9 MiB
 debugedit  ppc64le  0:5.3-1.fc45  fedora  384.4 KiB
 dwz  ppc64le  0:0.16-3.fc44  fedora  386.5 KiB
 ed  ppc64le  0:1.22.5-2.fc45  fedora  221.6 KiB
 efi-srpm-macros  noarch  0:6-6.fc44  fedora  40.2 KiB
 elfutils  ppc64le  0:0.194-5.fc45  fedora  3.4 MiB
 elfutils-debuginfod-client  ppc64le  0:0.194-5.fc45  fedora  143.3 KiB
 elfutils-default-yama-scope  noarch  0:0.194-5.fc45  fedora  1.8 KiB
 elfutils-libelf  ppc64le  0:0.194-5.fc45  fedora  1.2 MiB
 elfutils-libs  ppc64le  0:0.194-5.fc45  fedora  874.3 KiB
 erlang-srpm-macros  noarch  0:0.3.11-1.fc45  fedora  1.9 KiB
 fedora-gpg-keys  noarch  0:45-0.1  fedora  133.4 KiB
 fedora-release  noarch  0:45-0.2  fedora  0.0 B
 fedora-release-identity-basic  noarch  0:45-0.2  fedora  664.0 B
 fedora-repos  noarch  0:45-0.1  fedora  4.9 KiB
 fedora-repos-rawhide  noarch  0:45-0.1  fedora  2.2 KiB
 file  ppc64le  0:5.47-1.fc45  fedora  141.0 KiB
 file-libs  ppc64le  0:5.47-1.fc45  fedora  12.3 MiB
 filesystem  ppc64le  0:3.18-56.fc45  fedora  112.0 B
 filesystem-srpm-macros  noarch  0:3.18-56.fc45  fedora  38.2 KiB
 fonts-srpm-macros  noarch  1:5.0.0-3.fc45  fedora  55.8 KiB
 forge-srpm-macros  noarch  0:0.4.0-4.fc44  fedora  38.9 KiB
 fpc-srpm-macros  noarch  0:1.3-16.fc44  fedora  144.0 B
 gap-srpm-macros  noarch  0:2-2.fc44  fedora  2.1 KiB
 gdb-minimal  ppc64le  0:17.1-5.fc45  fedora  16.3 MiB
 gdbm-libs  ppc64le  1:1.23-11.fc44  fedora  233.3 KiB
 ghc-srpm-macros  noarch  0:1.10-1.fc44  fedora  792.0 B
 glibc  ppc64le  0:2.43.9000-4.fc45  fedora  11.7 MiB
 glibc-common  ppc64le  0:2.43.9000-4.fc45  fedora  1.5 MiB
 glibc-gconv-extra  ppc64le  0:2.43.9000-4.fc45  fedora  18.5 MiB
 gmp  ppc64le  1:6.3.0-5.fc44  fedora  786.1 KiB
 gnat-srpm-macros  noarch  0:7-2.fc44  fedora  1.0 KiB
 gnulib-l10n  noarch  0:20241231-2.fc44  fedora  655.0 KiB
 gnupg2  ppc64le  0:2.4.9-5.fc44  fedora  6.8 MiB
 gnupg2-dirmngr  ppc64le  0:2.4.9-5.fc44  fedora  837.8 KiB
 gnupg2-gpg-agent  ppc64le  0:2.4.9-5.fc44  fedora  974.1 KiB
 gnupg2-gpgconf  ppc64le  0:2.4.9-5.fc44  fedora  321.5 KiB
 gnupg2-keyboxd  ppc64le  0:2.4.9-5.fc44  fedora  233.1 KiB
 gnupg2-verify  ppc64le  0:2.4.9-5.fc44  fedora  428.1 KiB
 gnutls  ppc64le  0:3.8.12-1.fc45  fedora  4.0 MiB
 go-srpm-macros  noarch  0:3.8.0-2.fc44  fedora  61.9 KiB
 gpgverify  noarch  0:2.2-4.fc44  fedora  8.7 KiB
 ima-evm-utils-libs  ppc64le  0:1.6.2-8.fc44  fedora  92.5 KiB
 jansson  ppc64le  0:2.14-4.fc44  fedora  156.9 KiB
 java-srpm-macros  noarch  0:1-8.fc44  fedora  870.0 B
 json-c  ppc64le  0:0.18-8.fc44  fedora  138.5 KiB
 kernel-srpm-macros  noarch  0:1.0-28.fc44  fedora  1.9 KiB
 keyutils-libs  ppc64le  0:1.6.3-7.fc44  fedora  97.9 KiB
 krb5-libs  ppc64le  0:1.22.2-2.fc45  fedora  3.0 MiB
 libacl  ppc64le  0:2.3.2-6.fc44  fedora  67.5 KiB
 libarchive  ppc64le  0:3.8.6-1.fc45  fedora  1.3 MiB
 libassuan  ppc64le  0:2.5.7-5.fc44  fedora  215.8 KiB
 libattr  ppc64le  0:2.5.2-8.fc44  fedora  68.0 KiB
 libblkid  ppc64le  0:2.41.3-12.fc44  fedora  354.4 KiB
 libbrotli  ppc64le  0:1.2.0-3.fc44  fedora  1.0 MiB
 libcap  ppc64le  0:2.77-2.fc44  fedora  506.7 KiB
 libcap-ng  ppc64le  0:0.9.1-1.fc45  fedora  160.1 KiB
 libcbor  ppc64le  0:0.13.0-2.fc44  fedora  139.5 KiB
 libcom_err  ppc64le  0:1.47.3-4.fc44  fedora  110.9 KiB
 libcurl  ppc64le  0:8.19.0~rc3-1.fc45  fedora  1.2 MiB
 libeconf  ppc64le  0:0.7.9-3.fc44  fedora  80.7 KiB
 libevent  ppc64le  0:2.1.12-17.fc44  fedora  1.3 MiB
 libfdisk  ppc64le  0:2.41.3-12.fc44  fedora  482.9 KiB
 libffi  ppc64le  0:3.5.2-2.fc44  fedora  347.5 KiB
 libfido2  ppc64le  0:1.16.0-5.fc44  fedora  342.5 KiB
 libfsverity  ppc64le  0:1.7-1.fc45  fedora  68.0 KiB
 libgcc  ppc64le  0:16.0.1-0.9.fc45  fedora  286.4 KiB
 libgcrypt  ppc64le  0:1.12.1-1.fc45  fedora  1.6 MiB
 libgomp  ppc64le  0:16.0.1-0.9.fc45  fedora  656.3 KiB
 libgpg-error  ppc64le  0:1.59-1.fc45  fedora  1.0 MiB
 libidn2  ppc64le  0:2.3.8-3.fc44  fedora  560.1 KiB
 libksba  ppc64le  0:1.6.8-1.fc45  fedora  529.8 KiB
 liblastlog2  ppc64le  0:2.41.3-12.fc44  fedora  137.2 KiB
 libmount  ppc64le  0:2.41.3-12.fc44  fedora  548.2 KiB
 libnghttp2  ppc64le  0:1.68.0-3.fc44  fedora  197.8 KiB
 libnghttp3  ppc64le  0:1.15.0-1.fc44  fedora  206.5 KiB
 libpkgconf  ppc64le  0:2.5.1-1.fc45  fedora  133.9 KiB
 libpsl  ppc64le  0:0.21.5-7.fc44  fedora  132.0 KiB
 librtas  ppc64le  0:2.0.6-6.fc44  fedora  305.3 KiB
 libselinux  ppc64le  0:3.10-1.fc44  fedora  265.0 KiB
 libselinux-utils  ppc64le  0:3.10-1.fc44  fedora  1.3 MiB
 libsemanage  ppc64le  0:3.10-1.fc44  fedora  423.8 KiB
 libsepol  ppc64le  0:3.10-1.fc44  fedora  1.0 MiB
 libsmartcols  ppc64le  0:2.41.3-12.fc44  fedora  289.3 KiB
 libssh  ppc64le  0:0.12.0-1.fc45  fedora  911.1 KiB
 libssh-config  noarch  0:0.12.0-1.fc45  fedora  277.0 B
 libstdc++  ppc64le  0:16.0.1-0.9.fc45  fedora  3.9 MiB
 libtasn1  ppc64le  0:4.20.0-3.fc44  fedora  219.9 KiB
 libtool-ltdl  ppc64le  0:2.5.4-10.fc44  fedora  93.8 KiB
 libunistring  ppc64le  0:1.1-11.fc44  fedora  1.9 MiB
 libusb1  ppc64le  0:1.0.29-5.fc44  fedora  242.7 KiB
 libuuid  ppc64le  0:2.41.3-12.fc44  fedora  69.2 KiB
 libverto  ppc64le  0:0.3.2-12.fc44  fedora  69.0 KiB
 libxcrypt  ppc64le  0:4.5.2-3.fc44  fedora  336.9 KiB
 libxml2  ppc64le  0:2.12.10-6.fc44  fedora  2.3 MiB
 libzstd  ppc64le  0:1.5.7-5.fc44  fedora  1.1 MiB
 linkdupes  ppc64le  0:0.7.2-4.fc45  fedora  905.6 KiB
 lua-libs  ppc64le  0:5.5.0-1.fc45  fedora  393.9 KiB
 lua-srpm-macros  noarch  0:1-17.fc44  fedora  1.3 KiB
 lz4-libs  ppc64le  0:1.10.0-4.fc44  fedora  261.0 KiB
 mpfr  ppc64le  0:4.2.2-3.fc44  fedora  913.5 KiB
 ncurses-base  noarch  0:6.6-1.fc44  fedora  329.7 KiB
 ncurses-libs  ppc64le  0:6.6-1.fc44  fedora  1.5 MiB
 nettle  ppc64le  0:3.10.1-3.fc44  fedora  957.4 KiB
 ngtcp2  ppc64le  0:1.19.0-2.fc44  fedora  405.0 KiB
 ngtcp2-crypto-ossl  ppc64le  0:1.19.0-2.fc44  fedora  67.1 KiB
 npth  ppc64le  0:1.8-4.fc44  fedora  93.0 KiB
 ocaml-srpm-macros  noarch  0:11-3.fc44  fedora  1.9 KiB
 openblas-srpm-macros  noarch  0:2-21.fc44  fedora  112.0 B
 openldap  ppc64le  0:2.6.10-7.fc44  fedora  889.6 KiB
 openssl-libs  ppc64le  1:3.5.5-1.fc44  fedora  9.1 MiB
 p11-kit  ppc64le  0:0.26.2-1.fc45  fedora  3.2 MiB
 p11-kit-trust  ppc64le  0:0.26.2-1.fc45  fedora  593.9 KiB
 package-notes-srpm-macros  noarch  0:0.17-3.fc45  fedora  1.6 KiB
 pam-libs  ppc64le  0:1.7.2-1.fc44  fedora  286.0 KiB
 pcre2  ppc64le  0:10.47-1.fc44.1  fedora  841.8 KiB
 pcre2-syntax  noarch  0:10.47-1.fc44.1  fedora  281.9 KiB
 perl-srpm-macros  noarch  0:1-61.fc44  fedora  861.0 B
 pkgconf  ppc64le  0:2.5.1-1.fc45  fedora  116.6 KiB
 pkgconf-m4  noarch  0:2.5.1-1.fc45  fedora  14.3 KiB
 pkgconf-pkg-config  ppc64le  0:2.5.1-1.fc45  fedora  990.0 B
 policycoreutils  ppc64le  0:3.10-2.fc45  fedora  1.5 MiB
 popt  ppc64le  0:1.19-10.fc44  fedora  208.6 KiB
 publicsuffix-list-dafsa  noarch  0:20260116-1.fc44  fedora  70.4 KiB
 pyproject-srpm-macros  noarch  0:1.18.7-1.fc45  fedora  1.9 KiB
 python-srpm-macros  noarch  0:3.14-10.fc44  fedora  51.6 KiB
 qt5-srpm-macros  noarch  0:5.15.18-2.fc44  fedora  500.0 B
 qt6-srpm-macros  noarch  0:6.10.2-1.fc45  fedora  472.0 B
 readline  ppc64le  0:8.3-4.fc44  fedora  627.4 KiB
 rpm  ppc64le  0:6.0.1-5.fc45  fedora  4.2 MiB
 rpm-build-libs  ppc64le  0:6.0.1-5.fc45  fedora  327.9 KiB
 rpm-libs  ppc64le  0:6.0.1-5.fc45  fedora  1.2 MiB
 rpm-plugin-selinux  ppc64le  0:6.0.1-5.fc45  fedora  67.8 KiB
 rpm-sequoia  ppc64le  0:1.10.1-1.fc45  fedora  4.8 MiB
 rpm-sign-libs  ppc64le  0:6.0.1-5.fc45  fedora  67.5 KiB
 rust-srpm-macros  noarch  0:28.4-3.fc44  fedora  5.5 KiB
 selinux-policy  noarch  0:43.1-1.fc45  fedora  32.0 KiB
 selinux-policy-targeted  noarch  0:43.1-1.fc45  fedora  18.5 MiB
 setup  noarch  0:2.15.0-28.fc44  fedora  724.9 KiB
 sqlite-libs  ppc64le  0:3.51.2-1.fc44  fedora  1.9 MiB
 systemd-libs  ppc64le  0:260~rc2-1.fc45  fedora  3.0 MiB
 systemd-standalone-sysusers  ppc64le  0:260~rc2-1.fc45  fedora  976.2 KiB
 tpm2-tss  ppc64le  0:4.1.3-9.fc44  fedora  2.4 MiB
 tree-sitter-srpm-macros  noarch  0:0.4.2-2.fc44  fedora  8.3 KiB
 util-linux-core  ppc64le  0:2.41.3-12.fc44  fedora  2.5 MiB
 xxhash-libs  ppc64le  0:0.8.3-4.fc44  fedora  85.5 KiB
 xz-libs  ppc64le  1:5.8.2-2.fc44  fedora  265.2 KiB
 zig-srpm-macros  noarch  0:1-8.fc44  fedora  1.3 KiB
 zip  ppc64le  0:3.0-45.fc44  fedora  889.1 KiB
 zlib-ng-compat  ppc64le  0:2.3.3-5.fc45  fedora  197.3 KiB
 zstd  ppc64le  0:1.5.7-5.fc44  fedora  562.0 KiB
Installing groups:
 Buildsystem building group

Transaction Summary:
 Installing: 186 packages

Total size of inbound packages is 74 MiB. Need to download 74 MiB.
After this operation, 281 MiB extra will be used (install 281 MiB, remove 0 B).
[  1/186] bzip2-0:1.0.8-23.fc44.ppc64le 100% | 1.1 MiB/s | 53.0 KiB | 00m00s
[  2/186] coreutils-0:9.10-3.fc45.ppc64 100% | 22.2 MiB/s | 1.2 MiB | 00m00s
[  3/186] bash-0:5.3.9-3.fc44.ppc64le 100% | 29.1 MiB/s | 1.9 MiB | 00m00s
[  4/186] diffutils-0:3.12-5.fc44.ppc64 100% | 28.3 MiB/s | 406.0 KiB | 00m00s
[  5/186] cpio-0:2.15-9.fc44.ppc64le 100% | 13.4 MiB/s | 300.9 KiB | 00m00s
[  6/186] fedora-release-common-0:45-0. 100% | 4.7 MiB/s | 23.9 KiB | 00m00s
[  7/186] findutils-1:4.10.0-7.fc44.ppc 100% | 95.5 MiB/s | 586.5 KiB | 00m00s
[  8/186] glibc-minimal-langpack-0:2.43 100% | 16.7 MiB/s | 85.3 KiB | 00m00s
[  9/186] grep-0:3.12-3.fc44.ppc64le 100% | 100.5 MiB/s | 308.8 KiB | 00m00s
[ 10/186] gzip-0:1.14-2.fc44.ppc64le 100% | 34.9 MiB/s | 178.8 KiB | 00m00s
[ 11/186] info-0:7.3-1.fc45.ppc64le 100% | 67.3 MiB/s | 206.9 KiB | 00m00s
[ 12/186] patch-0:2.8-4.fc44.ppc64le 100% | 40.2 MiB/s | 123.6 KiB | 00m00s
[ 13/186] redhat-rpm-config-0:344-1.fc4 100% | 38.3 MiB/s | 78.4 KiB | 00m00s
[ 14/186] rpm-build-0:6.0.1-5.fc45.ppc6 100% | 78.6 MiB/s | 161.0 KiB | 00m00s
[ 15/186] sed-0:4.9-8.fc45.ppc64le 100% | 35.0 MiB/s | 322.7 KiB | 00m00s
[ 16/186] gawk-0:5.4.0-2.fc45.ppc64le 100% | 37.4 MiB/s | 1.2 MiB | 00m00s
[ 17/186] unzip-0:6.0-69.fc44.ppc64le 100% | 40.1 MiB/s | 205.2 KiB | 00m00s
[ 18/186] tar-2:1.35-8.fc44.ppc64le 100% | 51.0 MiB/s | 887.0 KiB | 00m00s
[ 19/186] shadow-utils-2:4.19.3-1.fc45. 100% | 42.3 MiB/s | 1.3 MiB | 00m00s
[ 20/186] which-0:2.23-4.fc44.ppc64le 100% | 6.0 MiB/s | 43.1 KiB | 00m00s
[ 21/186] util-linux-0:2.41.3-12.fc44.p 100% | 55.0 MiB/s | 1.3 MiB | 00m00s
[ 22/186] xz-1:5.8.2-2.fc44.ppc64le 100% | 35.1 MiB/s | 610.3 KiB | 00m00s
[ 23/186] filesystem-0:3.18-56.fc45.ppc 100% | 63.2 MiB/s | 1.8 MiB | 00m00s
[ 24/186] ncurses-libs-0:6.6-1.fc44.ppc 100% | 24.9 MiB/s | 382.0 KiB | 00m00s
[ 25/186] bzip2-libs-0:1.0.8-23.fc44.pp 100% | 16.1 MiB/s | 49.4 KiB | 00m00s
[ 26/186] gmp-1:6.3.0-5.fc44.ppc64le 100% | 78.1 MiB/s | 319.9 KiB | 00m00s
[ 27/186] libacl-0:2.3.2-6.fc44.ppc64le 100% | 12.9 MiB/s | 26.4 KiB | 00m00s
[ 28/186] libattr-0:2.5.2-8.fc44.ppc64l 100% | 18.2 MiB/s | 18.7 KiB | 00m00s
[ 29/186] libcap-0:2.77-2.fc44.ppc64le 100% | 46.0 MiB/s | 94.2 KiB | 00m00s
[ 30/186] glibc-0:2.43.9000-4.fc45.ppc6 100% | 64.1 MiB/s | 3.3 MiB | 00m00s
[ 31/186] libselinux-0:3.10-1.fc44.ppc6 100% | 6.8 MiB/s | 111.7 KiB | 00m00s
[ 32/186] coreutils-common-0:9.10-3.fc4 100% | 48.5 MiB/s | 2.1 MiB | 00m00s
[ 33/186] systemd-libs-0:260~rc2-1.fc45 100% | 41.8 MiB/s | 899.9 KiB | 00m00s
[ 34/186] fedora-repos-0:45-0.1.noarch 100% | 1.3 MiB/s | 9.2 KiB | 00m00s
[ 35/186] mpfr-0:4.2.2-3.fc44.ppc64le 100% | 58.2 MiB/s | 357.9 KiB | 00m00s
[ 36/186] readline-0:8.3-4.fc44.ppc64le 100% | 39.6 MiB/s | 243.1 KiB | 00m00s
[ 37/186] openssl-libs-1:3.5.5-1.fc44.p 100% | 63.1 MiB/s | 2.8 MiB | 00m00s
[ 38/186] pcre2-0:10.47-1.fc44.1.ppc64l 100% | 15.5 MiB/s | 285.6 KiB | 00m00s
[ 39/186] glibc-common-0:2.43.9000-4.fc 100% | 18.8 MiB/s | 385.8 KiB | 00m00s
[ 40/186] ed-0:1.22.5-2.fc45.ppc64le 100% | 17.0 MiB/s | 87.2 KiB | 00m00s
[ 41/186] R-srpm-macros-0:1.3.5-1.fc45. 100% | 3.5 MiB/s | 10.8 KiB | 00m00s
[ 42/186] ansible-srpm-macros-0:1-20.1. 100% | 9.8 MiB/s | 20.1 KiB | 00m00s
[ 43/186] cmake-srpm-macros-0:4.2.3-2.f 100% | 5.1 MiB/s | 10.5 KiB | 00m00s
[ 44/186] build-reproducibility-srpm-ma 100% | 4.3 MiB/s | 13.1 KiB | 00m00s
[ 45/186] erlang-srpm-macros-0:0.3.11-1 100% | 4.9 MiB/s | 10.0 KiB | 00m00s
[ 46/186] efi-srpm-macros-0:6-6.fc44.no 100% | 7.3 MiB/s | 22.6 KiB | 00m00s
[ 47/186] dwz-0:0.16-3.fc44.ppc64le 100% | 35.6 MiB/s | 145.8 KiB | 00m00s
[ 48/186] file-0:5.47-1.fc45.ppc64le 100% | 12.1 MiB/s | 49.7 KiB | 00m00s
[ 49/186] filesystem-srpm-macros-0:3.18 100% | 6.6 MiB/s | 26.8 KiB | 00m00s
[ 50/186] fonts-srpm-macros-1:5.0.0-3.f 100% | 8.9 MiB/s | 27.2 KiB | 00m00s
[ 51/186] forge-srpm-macros-0:0.4.0-4.f 100% | 9.8 MiB/s | 20.0 KiB | 00m00s
[ 52/186] gap-srpm-macros-0:2-2.fc44.no 100% | 4.5 MiB/s | 9.1 KiB | 00m00s
[ 53/186] fpc-srpm-macros-0:1.3-16.fc44 100% | 2.6 MiB/s | 7.9 KiB | 00m00s
[ 54/186] ghc-srpm-macros-0:1.10-1.fc44 100% | 4.3 MiB/s | 8.8 KiB | 00m00s
[ 55/186] go-srpm-macros-0:3.8.0-2.fc44 100% | 13.8 MiB/s | 28.2 KiB | 00m00s
[ 56/186] gnat-srpm-macros-0:7-2.fc44.n 100% | 2.8 MiB/s | 8.7 KiB | 00m00s
[ 57/186] java-srpm-macros-0:1-8.fc44.n 100% | 4.0 MiB/s | 8.1 KiB | 00m00s
[ 58/186] kernel-srpm-macros-0:1.0-28.f 100% | 2.9 MiB/s | 9.0 KiB | 00m00s
[ 59/186] lua-srpm-macros-0:1-17.fc44.n 100% | 4.3 MiB/s | 8.9 KiB | 00m00s
[ 60/186] ocaml-srpm-macros-0:11-3.fc44 100% | 4.5 MiB/s | 9.3 KiB | 00m00s
[ 61/186] package-notes-srpm-macros-0:0 100% | 4.7 MiB/s | 9.7 KiB | 00m00s
[ 62/186] openblas-srpm-macros-0:2-21.f 100% | 2.5 MiB/s | 7.8 KiB | 00m00s
[ 63/186] perl-srpm-macros-0:1-61.fc44. 100% | 8.2 MiB/s | 8.4 KiB | 00m00s
[ 64/186] python-srpm-macros-0:3.14-10. 100% | 23.4 MiB/s | 24.0 KiB | 00m00s
[ 65/186] pyproject-srpm-macros-0:1.18. 100% | 6.4 MiB/s | 13.1 KiB | 00m00s
[ 66/186] qt5-srpm-macros-0:5.15.18-2.f 100% | 8.5 MiB/s | 8.7 KiB | 00m00s
[ 67/186] qt6-srpm-macros-0:6.10.2-1.fc 100% | 8.9 MiB/s | 9.1 KiB | 00m00s
[ 68/186] tree-sitter-srpm-macros-0:0.4 100% | 13.2 MiB/s | 13.5 KiB | 00m00s
[ 69/186] rpm-0:6.0.1-5.fc45.ppc64le 100% | 94.2 MiB/s | 578.7 KiB | 00m00s
[ 70/186] rust-srpm-macros-0:28.4-3.fc4 100% | 2.2 MiB/s | 11.1 KiB | 00m00s
[ 71/186] zig-srpm-macros-0:1-8.fc44.no 100% | 2.9 MiB/s | 8.8 KiB | 00m00s
[ 72/186] zip-0:3.0-45.fc44.ppc64le 100% | 54.0 MiB/s | 276.6 KiB | 00m00s
[ 73/186] debugedit-0:5.3-1.fc45.ppc64l 100% | 14.9 MiB/s | 91.7 KiB | 00m00s
[ 74/186] elfutils-libelf-0:0.194-5.fc4 100% | 51.7 MiB/s | 211.8 KiB | 00m00s
[ 75/186] elfutils-0:0.194-5.fc45.ppc64 100% | 69.8 MiB/s | 572.1 KiB | 00m00s
[ 76/186] libgcc-0:16.0.1-0.9.fc45.ppc6 100% | 50.6 MiB/s | 103.7 KiB | 00m00s
[ 77/186] libarchive-0:3.8.6-1.fc45.ppc 100% | 53.8 MiB/s | 496.2 KiB | 00m00s
[ 78/186] pkgconf-pkg-config-0:2.5.1-1. 100% | 9.2 MiB/s | 9.4 KiB | 00m00s
[ 79/186] popt-0:1.19-10.fc44.ppc64le 100% | 34.1 MiB/s | 69.9 KiB | 00m00s
[ 80/186] libstdc++-0:16.0.1-0.9.fc45.p 100% | 79.3 MiB/s | 1.0 MiB | 00m00s
[ 81/186] rpm-build-libs-0:6.0.1-5.fc45 100% | 18.7 MiB/s | 134.1 KiB | 00m00s
[ 82/186] zstd-0:1.5.7-5.fc44.ppc64le 100% | 47.4 MiB/s | 194.3 KiB | 00m00s
[ 83/186] rpm-libs-0:6.0.1-5.fc45.ppc64 100% | 54.2 MiB/s | 444.3 KiB | 00m00s
[ 84/186] audit-libs-0:4.1.3-1.fc44.ppc 100% | 50.6 MiB/s | 155.5 KiB | 00m00s
[ 85/186] libeconf-0:0.7.9-3.fc44.ppc64 100% | 19.6 MiB/s | 40.1 KiB | 00m00s
[ 86/186] libsemanage-0:3.10-1.fc44.ppc 100% | 44.0 MiB/s | 135.1 KiB | 00m00s
[ 87/186] libxcrypt-0:4.5.2-3.fc44.ppc6 100% | 45.5 MiB/s | 139.7 KiB | 00m00s
[ 88/186] pam-libs-0:1.7.2-1.fc44.ppc64 100% | 61.5 MiB/s | 63.0 KiB | 00m00s
[ 89/186] setup-0:2.15.0-28.fc44.noarch 100% | 51.2 MiB/s | 157.2 KiB | 00m00s
[ 90/186] libblkid-0:2.41.3-12.fc44.ppc 100% | 46.5 MiB/s | 142.9 KiB | 00m00s
[ 91/186] libcap-ng-0:0.9.1-1.fc45.ppc6 100% | 33.0 MiB/s | 33.8 KiB | 00m00s
[ 92/186] libfdisk-0:2.41.3-12.fc44.ppc 100% | 57.6 MiB/s | 177.1 KiB | 00m00s
[ 93/186] liblastlog2-0:2.41.3-12.fc44. 100% | 11.8 MiB/s | 24.2 KiB | 00m00s
[ 94/186] librtas-0:2.0.6-6.fc44.ppc64l 100% | 40.6 MiB/s | 83.2 KiB | 00m00s
[ 95/186] libmount-0:2.41.3-12.fc44.ppc 100% | 45.3 MiB/s | 185.7 KiB | 00m00s
[ 96/186] binutils-0:2.46.50-2.fc45.ppc 100% | 71.9 MiB/s | 6.8 MiB | 00m00s
[ 97/186] libuuid-0:2.41.3-12.fc44.ppc6 100% | 818.4 KiB/s | 27.8 KiB | 00m00s
[ 98/186] libsmartcols-0:2.41.3-12.fc44 100% | 3.0 MiB/s | 110.6 KiB | 00m00s
[ 99/186] zlib-ng-compat-0:2.3.3-5.fc45 100% | 29.2 MiB/s | 89.7 KiB | 00m00s
[100/186] xz-libs-1:5.8.2-2.fc44.ppc64l 100% | 30.9 MiB/s | 126.8 KiB | 00m00s
[101/186] util-linux-core-0:2.41.3-12.f 100% | 57.7 MiB/s | 591.3 KiB | 00m00s
[102/186] ncurses-base-0:6.6-1.fc44.noa 100% | 17.2 MiB/s | 88.0 KiB | 00m00s
[103/186] gnulib-l10n-0:20241231-2.fc44 100% | 29.4 MiB/s | 150.3 KiB | 00m00s
[104/186] libsepol-0:3.10-1.fc44.ppc64l 100% | 62.1 MiB/s | 381.6 KiB | 00m00s
[105/186] glibc-gconv-extra-0:2.43.9000 100% | 75.1 MiB/s | 1.7 MiB | 00m00s
[106/186] crypto-policies-0:20251128-3. 100% | 9.6 MiB/s | 98.3 KiB | 00m00s
[107/186] fedora-repos-rawhide-0:45-0.1 100% | 8.6 MiB/s | 8.8 KiB | 00m00s
[108/186] fedora-gpg-keys-0:45-0.1.noar 100% | 34.5 MiB/s | 141.3 KiB | 00m00s
[109/186] ca-certificates-0:2025.2.80_v 100% | 45.2 MiB/s | 972.8 KiB | 00m00s
[110/186] pcre2-syntax-0:10.47-1.fc44.1 100% | 23.0 MiB/s | 164.7 KiB | 00m00s
[111/186] add-determinism-0:0.7.2-4.fc4 100% | 61.9 MiB/s | 887.8 KiB | 00m00s
[112/186] linkdupes-0:0.7.2-4.fc45.ppc6 100% | 33.3 MiB/s | 374.9 KiB | 00m00s
[113/186] curl-0:8.19.0~rc3-1.fc45.ppc6 100% | 60.5 MiB/s | 247.6 KiB | 00m00s
[114/186] alternatives-0:1.33-5.fc44.pp 100% | 14.0 MiB/s | 43.2 KiB | 00m00s
[115/186] file-libs-0:5.47-1.fc45.ppc64 100% | 50.6 MiB/s | 881.0 KiB | 00m00s
[116/186] elfutils-debuginfod-client-0: 100% | 9.6 MiB/s | 49.3 KiB | 00m00s
[117/186] jansson-0:2.14-4.fc44.ppc64le 100% | 9.9 MiB/s | 50.6 KiB | 00m00s
[118/186] elfutils-libs-0:0.194-5.fc45. 100% | 75.2 MiB/s | 308.0 KiB | 00m00s
[119/186] libzstd-0:1.5.7-5.fc44.ppc64l 100% | 55.7 MiB/s | 399.6 KiB | 00m00s
[120/186] lz4-libs-0:1.10.0-4.fc44.ppc6 100% | 24.8 MiB/s | 101.7 KiB | 00m00s
[121/186] libxml2-0:2.12.10-6.fc44.ppc6 100% | 58.5 MiB/s | 778.2 KiB | 00m00s
[122/186] pkgconf-0:2.5.1-1.fc45.ppc64l 100% | 9.6 MiB/s | 49.1 KiB | 00m00s
[123/186] pkgconf-m4-0:2.5.1-1.fc45.noa 100% | 2.7 MiB/s | 13.8 KiB | 00m00s
[124/186] rpm-sign-libs-0:6.0.1-5.fc45. 100% | 13.8 MiB/s | 28.2 KiB | 00m00s
[125/186] libgomp-0:16.0.1-0.9.fc45.ppc 100% | 77.7 MiB/s | 397.6 KiB | 00m00s
[126/186] lua-libs-0:5.5.0-1.fc45.ppc64 100% | 25.6 MiB/s | 157.2 KiB | 00m00s
[127/186] libffi-0:3.5.2-2.fc44.ppc64le 100% | 13.7 MiB/s | 42.0 KiB | 00m00s
[128/186] rpm-sequoia-0:1.10.1-1.fc45.p 100% | 84.6 MiB/s | 1.5 MiB | 00m00s
[129/186] sqlite-libs-0:3.51.2-1.fc44.p 100% | 45.2 MiB/s | 880.1 KiB | 00m00s
[130/186] p11-kit-0:0.26.2-1.fc45.ppc64 100% | 29.8 MiB/s | 548.6 KiB | 00m00s
[131/186] json-c-0:0.18-8.fc44.ppc64le 100% | 12.2 MiB/s | 49.8 KiB | 00m00s
[132/186] p11-kit-trust-0:0.26.2-1.fc45 100% | 30.6 MiB/s | 156.4 KiB | 00m00s
[133/186] libpkgconf-0:2.5.1-1.fc45.ppc 100% | 45.7 MiB/s | 46.7 KiB | 00m00s
[134/186] elfutils-default-yama-scope-0 100% | 5.8 MiB/s | 11.9 KiB | 00m00s
[135/186] libfsverity-0:1.7-1.fc45.ppc6 100% | 9.4 MiB/s | 19.4 KiB | 00m00s
[136/186] ima-evm-utils-libs-0:1.6.2-8. 100% | 9.9 MiB/s | 30.3 KiB | 00m00s
[137/186] libtasn1-0:4.20.0-3.fc44.ppc6 100% | 39.3 MiB/s | 80.4 KiB | 00m00s
[138/186] gpgverify-0:2.2-4.fc44.noarch 100% | 5.5 MiB/s | 11.2 KiB | 00m00s
[139/186] gnupg2-dirmngr-0:2.4.9-5.fc44 100% | 76.4 MiB/s | 313.1 KiB | 00m00s
[140/186] gnupg2-0:2.4.9-5.fc44.ppc64le 100% | 88.9 MiB/s | 1.7 MiB | 00m00s
[141/186] gnupg2-gpgconf-0:2.4.9-5.fc44 100% | 13.6 MiB/s | 125.6 KiB | 00m00s
[142/186] gnupg2-gpg-agent-0:2.4.9-5.fc 100% | 19.8 MiB/s | 304.2 KiB | 00m00s
[143/186] gnupg2-keyboxd-0:2.4.9-5.fc44 100% | 33.5 MiB/s | 103.0 KiB | 00m00s
[144/186] libassuan-0:2.5.7-5.fc44.ppc6 100% | 69.9 MiB/s | 71.5 KiB | 00m00s
[145/186] gnupg2-verify-0:2.4.9-5.fc44. 100% | 36.0 MiB/s | 184.5 KiB | 00m00s
[146/186] libgcrypt-0:1.12.1-1.fc45.ppc 100% | 95.8 MiB/s | 686.5 KiB | 00m00s
[147/186] npth-0:1.8-4.fc44.ppc64le 100% | 4.9 MiB/s | 25.2 KiB | 00m00s
[148/186] libgpg-error-0:1.59-1.fc45.pp 100% | 31.7 MiB/s | 259.5 KiB | 00m00s
[149/186] tpm2-tss-0:4.1.3-9.fc44.ppc64 100% | 64.6 MiB/s | 396.8 KiB | 00m00s
[150/186] libksba-0:1.6.8-1.fc45.ppc64l 100% | 34.7 MiB/s | 177.4 KiB | 00m00s
[151/186] openldap-0:2.6.10-7.fc44.ppc6 100% | 56.4 MiB/s | 288.8 KiB | 00m00s
[152/186] libusb1-0:1.0.29-5.fc44.ppc64 100% | 20.9 MiB/s | 85.7 KiB | 00m00s
[153/186] gnutls-0:3.8.12-1.fc45.ppc64l 100% | 71.2 MiB/s | 1.4 MiB | 00m00s
[154/186] libidn2-0:2.3.8-3.fc44.ppc64l 100% | 19.2 MiB/s | 176.9 KiB | 00m00s
[155/186] libunistring-0:1.1-11.fc44.pp 100% | 43.2 MiB/s | 574.5 KiB | 00m00s
[156/186] nettle-0:3.10.1-3.fc44.ppc64l 100% | 49.9 MiB/s | 460.3 KiB | 00m00s
[157/186] libtool-ltdl-0:2.5.4-10.fc44. 100% | 38.6 MiB/s | 39.5 KiB | 00m00s
[158/186] libevent-0:2.1.12-17.fc44.ppc 100% | 47.7 MiB/s | 292.9 KiB | 00m00s
[159/186] cyrus-sasl-lib-0:2.1.28-35.fc 100% | 55.0 MiB/s | 901.5 KiB | 00m00s
[160/186] gdbm-libs-1:1.23-11.fc44.ppc6 100% | 8.6 MiB/s | 62.0 KiB | 00m00s
[161/186] fedora-release-0:45-0.2.noarc 100% | 2.1 MiB/s | 12.8 KiB | 00m00s
[162/186] xxhash-libs-0:0.8.3-4.fc44.pp 100% | 18.5 MiB/s | 37.8 KiB | 00m00s
[163/186] systemd-standalone-sysusers-0 100% | 59.8 MiB/s | 367.6 KiB | 00m00s
[164/186] fedora-release-identity-basic 100% | 4.4 MiB/s | 13.5 KiB | 00m00s
[165/186] libcurl-0:8.19.0~rc3-1.fc45.p 100% | 59.9 MiB/s | 490.7 KiB | 00m00s
[166/186] krb5-libs-0:1.22.2-2.fc45.ppc 100% | 64.5 MiB/s | 858.7 KiB | 00m00s
[167/186] libbrotli-0:1.2.0-3.fc44.ppc6 100% | 42.7 MiB/s | 393.6 KiB | 00m00s
[168/186] libnghttp2-0:1.68.0-3.fc44.pp 100% | 20.2 MiB/s | 82.6 KiB | 00m00s
[169/186] libnghttp3-0:1.15.0-1.fc44.pp 100% | 37.4 MiB/s | 76.5 KiB | 00m00s
[170/186] libpsl-0:0.21.5-7.fc44.ppc64l 100% | 21.6 MiB/s | 66.4 KiB | 00m00s
[171/186] ngtcp2-0:1.19.0-2.fc44.ppc64l 100% | 54.3 MiB/s | 166.8 KiB | 00m00s
[172/186] libssh-0:0.12.0-1.fc45.ppc64l 100% | 51.4 MiB/s | 315.7 KiB | 00m00s
[173/186] ngtcp2-crypto-ossl-0:1.19.0-2 100% | 13.0 MiB/s | 26.6 KiB | 00m00s
[174/186] gdb-minimal-0:17.1-5.fc45.ppc 100% | 79.6 MiB/s | 4.9 MiB | 00m00s
[175/186] keyutils-libs-0:1.6.3-7.fc44. 100% | 1.3 MiB/s | 32.4 KiB | 00m00s
[176/186] libcom_err-0:1.47.3-4.fc44.pp 100% | 1.1 MiB/s | 27.3 KiB | 00m00s
[177/186] libverto-0:0.3.2-12.fc44.ppc6 100% | 5.3 MiB/s | 21.8 KiB | 00m00s
[178/186] publicsuffix-list-dafsa-0:202 100% | 14.7 MiB/s | 60.3 KiB | 00m00s
[179/186] libfido2-0:1.16.0-5.fc44.ppc6 100% | 26.7 MiB/s | 109.4 KiB | 00m00s
[180/186] libssh-config-0:0.12.0-1.fc45 100% | 9.2 MiB/s | 9.4 KiB | 00m00s
[181/186] libcbor-0:0.13.0-2.fc44.ppc64 100% | 12.0 MiB/s | 36.8 KiB | 00m00s
[182/186] policycoreutils-0:3.10-2.fc45 100% | 50.9 MiB/s | 260.8 KiB | 00m00s
[183/186] selinux-policy-0:43.1-1.fc45. 100% | 22.9 MiB/s | 70.4 KiB | 00m00s
[184/186] libselinux-utils-0:3.10-1.fc4 100% | 60.5 MiB/s | 123.9 KiB | 00m00s
[185/186] rpm-plugin-selinux-0:6.0.1-5. 100% | 9.3 MiB/s | 19.1 KiB | 00m00s
[186/186] selinux-policy-targeted-0:43. 100% | 124.0 MiB/s | 6.8 MiB | 00m00s
--------------------------------------------------------------------------------
[186/186] Total 100% | 108.5 MiB/s | 73.9 MiB | 00m01s
Running transaction
Importing OpenPGP key 0xF577861E:
 UserID     : "Fedora (45) <fedora-45-primary@fedoraproject.org>"
 Fingerprint: 4F50A6114CD5C6976A7F1179655A4B02F577861E
 From       : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-45-primary
The key was successfully imported.
Importing OpenPGP key 0xF577861E:
 UserID     : "Fedora (45) <fedora-45-primary@fedoraproject.org>"
 Fingerprint: 4F50A6114CD5C6976A7F1179655A4B02F577861E
 From       : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-45-primary
The key was successfully imported.
Importing OpenPGP key 0x6D9F90A6:
 UserID     : "Fedora (44) <fedora-44-primary@fedoraproject.org>"
 Fingerprint: 36F612DCF27F7D1A48A835E4DBFCF71C6D9F90A6
 From       : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-44-primary
The key was successfully imported.
Importing OpenPGP key 0x91211FCE:
 UserID     : "Fedora (46) <fedora-46-primary@fedoraproject.org>"
 Fingerprint: D924B10D3E810DABDD8B56B596E7E91491211FCE
 From       : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-46-primary
The key was successfully imported.
[ 1/188] Verify package files 100% | 158.0 B/s | 186.0 B | 00m01s
[ 2/188] Prepare transaction 100% | 934.0 B/s | 186.0 B | 00m00s
[ 3/188] Installing libgcc-0:16.0.1-0. 100% | 70.3 MiB/s | 288.1 KiB | 00m00s
[ 4/188] Installing libssh-config-0:0. 100% | 796.9 KiB/s | 816.0 B | 00m00s
[ 5/188] Installing publicsuffix-list- 100% | 69.4 MiB/s | 71.1 KiB | 00m00s
[ 6/188] Installing fedora-release-ide 100% | 898.4 KiB/s | 920.0 B | 00m00s
[ 7/188] Installing fedora-gpg-keys-0: 100% | 12.7 MiB/s | 182.1 KiB | 00m00s
[ 8/188] Installing fedora-repos-rawhi 100% | 2.4 MiB/s | 2.4 KiB | 00m00s
[ 9/188] Installing fedora-repos-0:45- 100% | 5.6 MiB/s | 5.7 KiB | 00m00s
[ 10/188] Installing fedora-release-com 100% | 8.1 MiB/s | 24.9 KiB | 00m00s
[ 11/188] Installing fedora-release-0:4 100% | 12.1 KiB/s | 124.0 B | 00m00s
>>> Running sysusers scriptlet: setup-0:2.15.0-28.fc44.noarch
>>> Finished sysusers scriptlet: setup-0:2.15.0-28.fc44.noarch
>>> Scriptlet output:
>>> Creating group 'adm' with GID 4.
>>> Creating group 'audio' with GID 63.
>>> Creating group 'cdrom' with GID 11.
>>> Creating group 'clock' with GID 103.
>>> Creating group 'dialout' with GID 18.
>>> Creating group 'disk' with GID 6.
>>> Creating group 'floppy' with GID 19.
>>> Creating group 'ftp' with GID 50.
>>> Creating group 'games' with GID 20.
>>> Creating group 'input' with GID 104.
>>> Creating group 'kmem' with GID 9.
>>> Creating group 'kvm' with GID 36.
>>> Creating group 'lock' with GID 54.
>>> Creating group 'lp' with GID 7.
>>> Creating group 'mail' with GID 12.
>>> Creating group 'man' with GID 15.
>>> Creating group 'mem' with GID 8.
>>> Creating group 'nobody' with GID 65534.
>>> Creating group 'render' with GID 105.
>>> Creating group 'root' with GID 0.
>>> Creating group 'sgx' with GID 106.
>>> Creating group 'sys' with GID 3.
>>> Creating group 'tape' with GID 33.
>>> Creating group 'tty' with GID 5.
>>> Creating group 'users' with GID 100.
>>> Creating group 'utmp' with GID 22.
>>> Creating group 'video' with GID 39.
>>> Creating group 'wheel' with GID 10.
>>> Creating user 'adm' (adm) with UID 3 and GID 4.
>>> Creating group 'bin' with GID 1.
>>> Creating user 'bin' (bin) with UID 1 and GID 1.
>>> Creating group 'daemon' with GID 2.
>>> Creating user 'daemon' (daemon) with UID 2 and GID 2.
>>> Creating user 'ftp' (FTP User) with UID 14 and GID 50.
>>> Creating user 'games' (games) with UID 12 and GID 100.
>>> Creating user 'halt' (halt) with UID 7 and GID 0.
>>> Creating user 'lp' (lp) with UID 4 and GID 7.
>>> Creating user 'mail' (mail) with UID 8 and GID 12.
>>> Creating user 'nobody' (Kernel Overflow User) with UID 65534 and GID 65534.
>>> Creating user 'operator' (operator) with UID 11 and GID 0.
>>> Creating user 'root' (Super User) with UID 0 and GID 0.
>>> Creating user 'shutdown' (shutdown) with UID 6 and GID 0.
>>> Creating user 'sync' (sync) with UID 5 and GID 0.
>>>
[ 12/188] Installing setup-0:2.15.0-28. 100% | 12.5 MiB/s | 730.6 KiB | 00m00s
>>> [RPM] /etc/hosts created as /etc/hosts.rpmnew
[ 13/188] Installing filesystem-0:3.18- 100% | 904.3 KiB/s | 289.4 KiB | 00m00s
[ 14/188] Installing pkgconf-m4-0:2.5.1 100% | 14.4 MiB/s | 14.7 KiB | 00m00s
[ 15/188] Installing pcre2-syntax-0:10. 100% | 69.4 MiB/s | 284.3 KiB | 00m00s
[ 16/188] Installing gnulib-l10n-0:2024 100% | 53.9 MiB/s | 661.9 KiB | 00m00s
[ 17/188] Installing coreutils-common-0 100% | 113.7 MiB/s | 10.7 MiB | 00m00s
[ 18/188] Installing ncurses-base-0:6.6 100% | 21.7 MiB/s | 355.3 KiB | 00m00s
[ 19/188] Installing bash-0:5.3.9-3.fc4 100% | 83.5 MiB/s | 8.9 MiB | 00m00s
[ 20/188] Installing glibc-common-0:2.4 100% | 26.5 MiB/s | 1.5 MiB | 00m00s
[ 21/188] Installing glibc-gconv-extra- 100% | 127.4 MiB/s | 18.6 MiB | 00m00s
[ 22/188] Installing glibc-0:2.43.9000- 100% | 82.2 MiB/s | 11.8 MiB | 00m00s
[ 23/188] Installing ncurses-libs-0:6.6 100% | 99.7 MiB/s | 1.5 MiB | 00m00s
[ 24/188] Installing glibc-minimal-lang 100% | 121.1 KiB/s | 124.0 B | 00m00s
[ 25/188] Installing zlib-ng-compat-0:2 100% | 64.5 MiB/s | 198.1 KiB | 00m00s
[ 26/188] Installing bzip2-libs-0:1.0.8 100% | 67.2 MiB/s | 137.5 KiB | 00m00s
[ 27/188] Installing libgpg-error-0:1.5 100% | 21.2 MiB/s | 1.0 MiB | 00m00s
[ 28/188] Installing libstdc++-0:16.0.1 100% | 126.3 MiB/s | 3.9 MiB | 00m00s
[ 29/188] Installing libassuan-0:2.5.7- 100% | 70.8 MiB/s | 217.6 KiB | 00m00s
[ 30/188] Installing libgcrypt-0:1.12.1 100% | 105.0 MiB/s | 1.6 MiB | 00m00s
[ 31/188] Installing readline-0:8.3-4.f 100% | 102.5 MiB/s | 629.5 KiB | 00m00s
[ 32/188] Installing gmp-1:6.3.0-5.fc44 100% | 96.2 MiB/s | 788.3 KiB | 00m00s
[ 33/188] Installing libuuid-0:2.41.3-1 100% | 34.3 MiB/s | 70.3 KiB | 00m00s
[ 34/188] Installing xz-libs-1:5.8.2-2. 100% | 86.7 MiB/s | 266.3 KiB | 00m00s
[ 35/188] Installing systemd-libs-0:260 100% | 121.1 MiB/s | 3.0 MiB | 00m00s
[ 36/188] Installing popt-0:1.19-10.fc4 100% | 26.3 MiB/s | 215.2 KiB | 00m00s
[ 37/188] Installing libzstd-0:1.5.7-5. 100% | 109.1 MiB/s | 1.1 MiB | 00m00s
[ 38/188] Installing elfutils-libelf-0: 100% | 120.7 MiB/s | 1.2 MiB | 00m00s
[ 39/188] Installing npth-0:1.8-4.fc44. 100% | 45.9 MiB/s | 94.1 KiB | 00m00s
[ 40/188] Installing libblkid-0:2.41.3- 100% | 86.8 MiB/s | 355.5 KiB | 00m00s
[ 41/188] Installing libsepol-0:3.10-1. 100% | 115.8 MiB/s | 1.0 MiB | 00m00s
[ 42/188] Installing sqlite-libs-0:3.51 100% | 107.9 MiB/s | 1.9 MiB | 00m00s
[ 43/188] Installing gnupg2-gpgconf-0:2 100% | 7.7 MiB/s | 323.6 KiB | 00m00s
[ 44/188] Installing libattr-0:2.5.2-8. 100% | 33.7 MiB/s | 69.0 KiB | 00m00s
[ 45/188] Installing libacl-0:2.3.2-6.f 100% | 66.7 MiB/s | 68.3 KiB | 00m00s
[ 46/188] Installing pcre2-0:10.47-1.fc 100% | 117.6 MiB/s | 843.2 KiB | 00m00s
[ 47/188] Installing libselinux-0:3.10- 100% | 65.0 MiB/s | 266.3 KiB | 00m00s
[ 48/188] Installing grep-0:3.12-3.fc44 100% | 19.6 MiB/s | 1.0 MiB | 00m00s
[ 49/188] Installing sed-0:4.9-8.fc45.p 100% | 18.1 MiB/s | 945.2 KiB | 00m00s
[ 50/188] Installing findutils-1:4.10.0 100% | 35.3 MiB/s | 2.0 MiB | 00m00s
[ 51/188] Installing libxcrypt-0:4.5.2- 100% | 66.3 MiB/s | 339.6 KiB | 00m00s
[ 52/188] Installing libtasn1-0:4.20.0- 100% | 72.2 MiB/s | 221.7 KiB | 00m00s
[ 53/188] Installing libunistring-0:1.1 100% | 117.0 MiB/s | 1.9 MiB | 00m00s
[ 54/188] Installing libidn2-0:2.3.8-3. 100% | 30.7 MiB/s | 566.3 KiB | 00m00s
[ 55/188] Installing crypto-policies-0: 100% | 10.3 MiB/s | 157.7 KiB | 00m00s
[ 56/188] Installing xz-1:5.8.2-2.fc44. 100% | 24.5 MiB/s | 1.4 MiB | 00m00s
[ 57/188] Installing libmount-0:2.41.3- 100% | 89.4 MiB/s | 549.3 KiB | 00m00s
[ 58/188] Installing gnupg2-verify-0:2. 100% | 10.0 MiB/s | 429.5 KiB | 00m00s
[ 59/188] Installing dwz-0:0.16-3.fc44. 100% | 9.0 MiB/s | 387.8 KiB | 00m00s
[ 60/188] Installing mpfr-0:4.2.2-3.fc4 100% | 89.4 MiB/s | 915.1 KiB | 00m00s
[ 61/188] Installing gawk-0:5.4.0-2.fc4 100% | 42.7 MiB/s | 3.2 MiB | 00m00s
[ 62/188] Installing libksba-0:1.6.8-1. 100% | 57.8 MiB/s | 532.4 KiB | 00m00s
[ 63/188] Installing unzip-0:6.0-69.fc4 100% | 9.0 MiB/s | 536.9 KiB | 00m00s
[ 64/188] Installing file-libs-0:5.47-1 100% | 180.4 MiB/s | 12.3 MiB | 00m00s
[ 65/188] Installing file-0:5.47-1.fc45 100% | 3.4 MiB/s | 142.5 KiB | 00m00s
[ 66/188] Installing diffutils-0:3.12-5 100% | 29.1 MiB/s | 1.7 MiB | 00m00s
[ 67/188] Installing libeconf-0:0.7.9-3 100% | 26.8 MiB/s | 82.4 KiB | 00m00s
[ 68/188] Installing libcap-ng-0:0.9.1- 100% | 52.7 MiB/s | 162.0 KiB | 00m00s
[ 69/188] Installing audit-libs-0:4.1.3 100% | 77.1 MiB/s | 552.7 KiB | 00m00s
[ 70/188] Installing pam-libs-0:1.7.2-1 100% | 70.4 MiB/s | 288.4 KiB | 00m00s
[ 71/188] Installing libcap-0:2.77-2.fc 100% | 11.1 MiB/s | 511.9 KiB | 00m00s
[ 72/188] Installing libsemanage-0:3.10 100% | 69.3 MiB/s | 425.6 KiB | 00m00s
[ 73/188] Installing libsmartcols-0:2.4 100% | 70.9 MiB/s | 290.4 KiB | 00m00s
[ 74/188] Installing alternatives-0:1.3 100% | 2.2 MiB/s | 91.5 KiB | 00m00s
[ 75/188] Installing lua-libs-0:5.5.0-1 100% | 77.2 MiB/s | 395.3 KiB | 00m00s
[ 76/188] Installing libffi-0:3.5.2-2.f 100% | 85.2 MiB/s | 348.9 KiB | 00m00s
[ 77/188] Installing p11-kit-0:0.26.2-1 100% | 44.7 MiB/s | 3.3 MiB | 00m00s
[ 78/188] Installing p11-kit-trust-0:0. 100% | 11.9 MiB/s | 595.6 KiB | 00m00s
[ 79/188] Installing json-c-0:0.18-8.fc 100% | 45.5 MiB/s | 139.7 KiB | 00m00s
[ 80/188] Installing ngtcp2-0:1.19.0-2.
100% | 79.4 MiB/s | 406.6 KiB | 00m00s
[ 81/188] Installing openssl-libs-1:3.5 100% | 128.3 MiB/s | 9.1 MiB | 00m00s
[ 82/188] Installing coreutils-0:9.10-3 100% | 80.9 MiB/s | 9.5 MiB | 00m00s
[ 83/188] Installing ca-certificates-0: 100% | 938.2 KiB/s | 2.5 MiB | 00m03s
[ 84/188] Installing gzip-0:1.14-2.fc44 100% | 9.2 MiB/s | 442.9 KiB | 00m00s
[ 85/188] Installing rpm-sequoia-0:1.10 100% | 128.8 MiB/s | 4.8 MiB | 00m00s
[ 86/188] Installing libfsverity-0:1.7- 100% | 33.7 MiB/s | 69.0 KiB | 00m00s
[ 87/188] Installing libevent-0:2.1.12- 100% | 115.6 MiB/s | 1.3 MiB | 00m00s
[ 88/188] Installing systemd-standalone 100% | 20.7 MiB/s | 976.8 KiB | 00m00s
[ 89/188] Installing rpm-libs-0:6.0.1-5 100% | 111.2 MiB/s | 1.2 MiB | 00m00s
[ 90/188] Installing ngtcp2-crypto-ossl 100% | 33.2 MiB/s | 68.0 KiB | 00m00s
[ 91/188] Installing util-linux-core-0: 100% | 36.0 MiB/s | 2.5 MiB | 00m00s
[ 92/188] Installing zip-0:3.0-45.fc44. 100% | 19.0 MiB/s | 893.0 KiB | 00m00s
[ 93/188] Installing gnupg2-keyboxd-0:2 100% | 16.3 MiB/s | 234.4 KiB | 00m00s
[ 94/188] Installing libpsl-0:0.21.5-7. 100% | 43.3 MiB/s | 133.1 KiB | 00m00s
[ 95/188] Installing tar-2:1.35-8.fc44. 100% | 46.7 MiB/s | 3.1 MiB | 00m00s
[ 96/188] Installing linkdupes-0:0.7.2- 100% | 19.3 MiB/s | 907.1 KiB | 00m00s
[ 97/188] Installing libselinux-utils-0 100% | 26.0 MiB/s | 1.4 MiB | 00m00s
[ 98/188] Installing liblastlog2-0:2.41 100% | 17.0 MiB/s | 139.2 KiB | 00m00s
[ 99/188] Installing libfdisk-0:2.41.3- 100% | 94.5 MiB/s | 484.0 KiB | 00m00s
[100/188] Installing zstd-0:1.5.7-5.fc4 100% | 12.3 MiB/s | 565.6 KiB | 00m00s
[101/188] Installing libusb1-0:1.0.29-5 100% | 19.9 MiB/s | 244.3 KiB | 00m00s
>>> Running sysusers scriptlet: tpm2-tss-0:4.1.3-9.fc44.ppc64le
>>> Finished sysusers scriptlet: tpm2-tss-0:4.1.3-9.fc44.ppc64le
>>> Scriptlet output:
>>> Creating group 'tss' with GID 59.
>>> Creating user 'tss' (Account used for TPM access) with UID 59 and GID 59.
>>>
[102/188] Installing tpm2-tss-0:4.1.3-9 100% | 114.5 MiB/s | 2.4 MiB | 00m00s
[103/188] Installing ima-evm-utils-libs 100% | 45.8 MiB/s | 93.8 KiB | 00m00s
[104/188] Installing gnupg2-gpg-agent-0 100% | 18.7 MiB/s | 977.9 KiB | 00m00s
[105/188] Installing libxml2-0:2.12.10- 100% | 39.8 MiB/s | 2.3 MiB | 00m00s
[106/188] Installing nettle-0:3.10.1-3. 100% | 104.2 MiB/s | 960.5 KiB | 00m00s
[107/188] Installing gnutls-0:3.8.12-1. 100% | 121.6 MiB/s | 4.0 MiB | 00m00s
[108/188] Installing bzip2-0:1.0.8-23.f 100% | 4.2 MiB/s | 175.4 KiB | 00m00s
[109/188] Installing add-determinism-0: 100% | 38.7 MiB/s | 2.3 MiB | 00m00s
[110/188] Installing cpio-0:2.15-9.fc44 100% | 21.8 MiB/s | 1.2 MiB | 00m00s
[111/188] Installing ed-0:1.22.5-2.fc45 100% | 5.3 MiB/s | 223.9 KiB | 00m00s
[112/188] Installing patch-0:2.8-4.fc44 100% | 6.3 MiB/s | 263.8 KiB | 00m00s
[113/188] Installing librtas-0:2.0.6-6. 100% | 21.4 MiB/s | 307.4 KiB | 00m00s
[114/188] Installing util-linux-0:2.41. 100% | 58.6 MiB/s | 7.0 MiB | 00m00s
[115/188] Installing policycoreutils-0: 100% | 20.4 MiB/s | 1.5 MiB | 00m00s
[116/188] Installing selinux-policy-0:4 100% | 1.3 MiB/s | 33.6 KiB | 00m00s
[117/188] Installing selinux-policy-tar 100% | 54.5 MiB/s | 14.8 MiB | 00m00s
[118/188] Installing build-reproducibil 100% | 1.5 MiB/s | 1.5 KiB | 00m00s
[119/188] Installing jansson-0:2.14-4.f 100% | 51.5 MiB/s | 158.3 KiB | 00m00s
[120/188] Installing lz4-libs-0:1.10.0- 100% | 64.0 MiB/s | 262.1 KiB | 00m00s
[121/188] Installing libarchive-0:3.8.6 100% | 97.5 MiB/s | 1.3 MiB | 00m00s
[122/188] Installing libgomp-0:16.0.1-0 100% | 107.0 MiB/s | 657.7 KiB | 00m00s
[123/188] Installing libpkgconf-0:2.5.1 100% | 44.0 MiB/s | 135.0 KiB | 00m00s
[124/188] Installing pkgconf-0:2.5.1-1. 100% | 2.8 MiB/s | 119.1 KiB | 00m00s
[125/188] Installing pkgconf-pkg-config 100% | 45.5 KiB/s | 1.8 KiB | 00m00s
[126/188] Installing libtool-ltdl-0:2.5 100% | 30.9 MiB/s | 94.9 KiB | 00m00s
[127/188] Installing gdbm-libs-1:1.23-1 100% | 57.4 MiB/s | 235.0 KiB | 00m00s
[128/188] Installing cyrus-sasl-lib-0:2 100% | 46.3 MiB/s | 2.9 MiB | 00m00s
[129/188] Installing openldap-0:2.6.10- 100% | 79.3 MiB/s | 893.4 KiB | 00m00s
[130/188] Installing gnupg2-dirmngr-0:2 100% | 15.8 MiB/s | 840.5 KiB | 00m00s
[131/188] Installing gnupg2-0:2.4.9-5.f 100% | 71.9 MiB/s | 6.8 MiB | 00m00s
[132/188] Installing rpm-sign-libs-0:6. 100% | 33.4 MiB/s | 68.4 KiB | 00m00s
[133/188] Installing gpgverify-0:2.2-4. 100% | 9.2 MiB/s | 9.4 KiB | 00m00s
[134/188] Installing xxhash-libs-0:0.8. 100% | 42.4 MiB/s | 86.9 KiB | 00m00s
[135/188] Installing libbrotli-0:1.2.0- 100% | 92.2 MiB/s | 1.0 MiB | 00m00s
[136/188] Installing libnghttp2-0:1.68. 100% | 64.8 MiB/s | 198.9 KiB | 00m00s
[137/188] Installing libnghttp3-0:1.15. 100% | 67.7 MiB/s | 207.9 KiB | 00m00s
[138/188] Installing keyutils-libs-0:1. 100% | 48.5 MiB/s | 99.3 KiB | 00m00s
[139/188] Installing libcom_err-0:1.47. 100% | 54.7 MiB/s | 112.0 KiB | 00m00s
[140/188] Installing libverto-0:0.3.2-1 100% | 23.0 MiB/s | 70.8 KiB | 00m00s
[141/188] Installing krb5-libs-0:1.22.2 100% | 106.1 MiB/s | 3.0 MiB | 00m00s
[142/188] Installing libcbor-0:0.13.0-2 100% | 45.9 MiB/s | 140.9 KiB | 00m00s
[143/188] Installing libfido2-0:1.16.0- 100% | 84.0 MiB/s | 344.0 KiB | 00m00s
[144/188] Installing libssh-0:0.12.0-1. 100% | 99.1 MiB/s | 913.2 KiB | 00m00s
[145/188] Installing libcurl-0:8.19.0~r 100% | 100.8 MiB/s | 1.2 MiB | 00m00s
[146/188] Installing curl-0:8.19.0~rc3- 100% | 9.6 MiB/s | 530.0 KiB | 00m00s
[147/188] Installing rpm-0:6.0.1-5.fc45 100% | 30.4 MiB/s | 2.8 MiB | 00m00s
[148/188] Installing cmake-srpm-macros- 100% | 785.2 KiB/s | 804.0 B | 00m00s
[149/188] Installing efi-srpm-macros-0: 100% | 20.1 MiB/s | 41.2 KiB | 00m00s
[150/188] Installing java-srpm-macros-0 100% | 1.1 MiB/s | 1.1 KiB | 00m00s
[151/188] Installing lua-srpm-macros-0: 100% | 1.9 MiB/s | 1.9 KiB | 00m00s
[152/188] Installing tree-sitter-srpm-m 100% | 4.5 MiB/s | 9.3 KiB | 00m00s
[153/188] Installing zig-srpm-macros-0: 100% | 1.8 MiB/s | 1.9 KiB | 00m00s
[154/188] Installing filesystem-srpm-ma 100% | 19.0 MiB/s | 38.9 KiB | 00m00s
[155/188] Installing elfutils-default-y 100% | 408.6 KiB/s | 2.0 KiB | 00m00s
[156/188] Installing elfutils-libs-0:0. 100% | 95.1 MiB/s | 876.1 KiB | 00m00s
[157/188] Installing elfutils-debuginfo 100% | 3.4 MiB/s | 145.6 KiB | 00m00s
[158/188] Installing binutils-0:2.46.50 100% | 117.6 MiB/s | 33.6 MiB | 00m00s
[159/188] Installing elfutils-0:0.194-5 100% | 53.1 MiB/s | 3.5 MiB | 00m00s
[160/188] Installing gdb-minimal-0:17.1 100% | 102.5 MiB/s | 16.3 MiB | 00m00s
[161/188] Installing debugedit-0:5.3-1. 100% | 8.8 MiB/s | 387.6 KiB | 00m00s
[162/188] Installing rpm-build-libs-0:6 100% | 80.3 MiB/s | 328.7 KiB | 00m00s
[163/188] Installing rust-srpm-macros-0 100% | 6.2 MiB/s | 6.4 KiB | 00m00s
[164/188] Installing qt6-srpm-macros-0: 100% | 730.5 KiB/s | 748.0 B | 00m00s
[165/188] Installing qt5-srpm-macros-0: 100% | 757.8 KiB/s | 776.0 B | 00m00s
[166/188] Installing perl-srpm-macros-0 100% | 1.1 MiB/s | 1.1 KiB | 00m00s
[167/188] Installing package-notes-srpm 100% | 2.0 MiB/s | 2.1 KiB | 00m00s
[168/188] Installing openblas-srpm-macr 100% | 382.8 KiB/s | 392.0 B | 00m00s
[169/188] Installing ocaml-srpm-macros- 100% | 2.1 MiB/s | 2.1 KiB | 00m00s
[170/188] Installing kernel-srpm-macros 100% | 2.3 MiB/s | 2.3 KiB | 00m00s
[171/188] Installing gnat-srpm-macros-0 100% | 1.2 MiB/s | 1.3 KiB | 00m00s
[172/188] Installing ghc-srpm-macros-0: 100% | 1.0 MiB/s | 1.0 KiB | 00m00s
[173/188] Installing gap-srpm-macros-0: 100% | 2.6 MiB/s | 2.7 KiB | 00m00s
[174/188] Installing fpc-srpm-macros-0: 100% | 410.2 KiB/s | 420.0 B | 00m00s
[175/188] Installing ansible-srpm-macro 100% | 17.7 MiB/s | 36.2 KiB | 00m00s
[176/188] Installing redhat-rpm-config- 100% | 12.4 MiB/s | 189.9 KiB | 00m00s
[177/188] Installing forge-srpm-macros- 100% | 19.7 MiB/s | 40.3 KiB | 00m00s
[178/188] Installing rpm-build-0:6.0.1- 100% | 13.1 MiB/s | 671.6 KiB | 00m00s
[179/188] Installing erlang-srpm-macros 100% | 2.4 MiB/s | 2.5 KiB | 00m00s
[180/188] Installing pyproject-srpm-mac 100% | 2.4 MiB/s | 2.5 KiB | 00m00s
[181/188] Installing fonts-srpm-macros- 100% | 27.8 MiB/s | 57.0 KiB | 00m00s
[182/188] Installing go-srpm-macros-0:3 100% | 30.8 MiB/s | 63.0 KiB | 00m00s
[183/188] Installing R-srpm-macros-0:1. 100% | 4.3 MiB/s | 4.4 KiB | 00m00s
[184/188] Installing python-srpm-macros 100% | 25.9 MiB/s | 52.9 KiB | 00m00s
[185/188] Installing rpm-plugin-selinux 100% | 33.6 MiB/s | 68.8 KiB | 00m00s
[186/188] Installing which-0:2.23-4.fc4 100% | 2.7 MiB/s | 125.4 KiB | 00m00s
[187/188] Installing shadow-utils-2:4.1 100% | 46.0 MiB/s | 4.9 MiB | 00m00s
[188/188] Installing info-0:7.3-1.fc45. 100% | 40.0 KiB/s | 488.5 KiB | 00m12s
Complete!
Finish: installing minimal buildroot with dnf5
Start: creating root cache
Finish: creating root cache
Finish: chroot init
INFO: Installed packages:
INFO: R-srpm-macros-1.3.5-1.fc45.noarch add-determinism-0.7.2-4.fc45.ppc64le alternatives-1.33-5.fc44.ppc64le ansible-srpm-macros-1-20.1.fc44.noarch audit-libs-4.1.3-1.fc44.ppc64le bash-5.3.9-3.fc44.ppc64le binutils-2.46.50-2.fc45.ppc64le build-reproducibility-srpm-macros-0.7.2-4.fc45.noarch bzip2-1.0.8-23.fc44.ppc64le bzip2-libs-1.0.8-23.fc44.ppc64le ca-certificates-2025.2.80_v9.0.304-6.fc45.noarch cmake-srpm-macros-4.2.3-2.fc45.noarch coreutils-9.10-3.fc45.ppc64le coreutils-common-9.10-3.fc45.ppc64le cpio-2.15-9.fc44.ppc64le crypto-policies-20251128-3.git19878fe.fc44.noarch curl-8.19.0~rc3-1.fc45.ppc64le cyrus-sasl-lib-2.1.28-35.fc44.ppc64le debugedit-5.3-1.fc45.ppc64le diffutils-3.12-5.fc44.ppc64le dwz-0.16-3.fc44.ppc64le ed-1.22.5-2.fc45.ppc64le efi-srpm-macros-6-6.fc44.noarch elfutils-0.194-5.fc45.ppc64le elfutils-debuginfod-client-0.194-5.fc45.ppc64le elfutils-default-yama-scope-0.194-5.fc45.noarch elfutils-libelf-0.194-5.fc45.ppc64le elfutils-libs-0.194-5.fc45.ppc64le erlang-srpm-macros-0.3.11-1.fc45.noarch fedora-gpg-keys-45-0.1.noarch fedora-release-45-0.2.noarch fedora-release-common-45-0.2.noarch fedora-release-identity-basic-45-0.2.noarch fedora-repos-45-0.1.noarch fedora-repos-rawhide-45-0.1.noarch file-5.47-1.fc45.ppc64le file-libs-5.47-1.fc45.ppc64le filesystem-3.18-56.fc45.ppc64le filesystem-srpm-macros-3.18-56.fc45.noarch findutils-4.10.0-7.fc44.ppc64le
fonts-srpm-macros-5.0.0-3.fc45.noarch forge-srpm-macros-0.4.0-4.fc44.noarch fpc-srpm-macros-1.3-16.fc44.noarch gap-srpm-macros-2-2.fc44.noarch gawk-5.4.0-2.fc45.ppc64le gdb-minimal-17.1-5.fc45.ppc64le gdbm-libs-1.23-11.fc44.ppc64le ghc-srpm-macros-1.10-1.fc44.noarch glibc-2.43.9000-4.fc45.ppc64le glibc-common-2.43.9000-4.fc45.ppc64le glibc-gconv-extra-2.43.9000-4.fc45.ppc64le glibc-minimal-langpack-2.43.9000-4.fc45.ppc64le gmp-6.3.0-5.fc44.ppc64le gnat-srpm-macros-7-2.fc44.noarch gnulib-l10n-20241231-2.fc44.noarch gnupg2-2.4.9-5.fc44.ppc64le gnupg2-dirmngr-2.4.9-5.fc44.ppc64le gnupg2-gpg-agent-2.4.9-5.fc44.ppc64le gnupg2-gpgconf-2.4.9-5.fc44.ppc64le gnupg2-keyboxd-2.4.9-5.fc44.ppc64le gnupg2-verify-2.4.9-5.fc44.ppc64le gnutls-3.8.12-1.fc45.ppc64le go-srpm-macros-3.8.0-2.fc44.noarch gpg-pubkey-36f612dcf27f7d1a48a835e4dbfcf71c6d9f90a6-6786af3b gpg-pubkey-4f50a6114cd5c6976a7f1179655a4b02f577861e-6888bc98 gpg-pubkey-d924b10d3e810dabdd8b56b596e7e91491211fce-697c9899 gpgverify-2.2-4.fc44.noarch grep-3.12-3.fc44.ppc64le gzip-1.14-2.fc44.ppc64le ima-evm-utils-libs-1.6.2-8.fc44.ppc64le info-7.3-1.fc45.ppc64le jansson-2.14-4.fc44.ppc64le java-srpm-macros-1-8.fc44.noarch json-c-0.18-8.fc44.ppc64le kernel-srpm-macros-1.0-28.fc44.noarch keyutils-libs-1.6.3-7.fc44.ppc64le krb5-libs-1.22.2-2.fc45.ppc64le libacl-2.3.2-6.fc44.ppc64le libarchive-3.8.6-1.fc45.ppc64le libassuan-2.5.7-5.fc44.ppc64le libattr-2.5.2-8.fc44.ppc64le libblkid-2.41.3-12.fc44.ppc64le libbrotli-1.2.0-3.fc44.ppc64le libcap-2.77-2.fc44.ppc64le libcap-ng-0.9.1-1.fc45.ppc64le libcbor-0.13.0-2.fc44.ppc64le libcom_err-1.47.3-4.fc44.ppc64le libcurl-8.19.0~rc3-1.fc45.ppc64le libeconf-0.7.9-3.fc44.ppc64le libevent-2.1.12-17.fc44.ppc64le libfdisk-2.41.3-12.fc44.ppc64le libffi-3.5.2-2.fc44.ppc64le libfido2-1.16.0-5.fc44.ppc64le libfsverity-1.7-1.fc45.ppc64le libgcc-16.0.1-0.9.fc45.ppc64le libgcrypt-1.12.1-1.fc45.ppc64le libgomp-16.0.1-0.9.fc45.ppc64le libgpg-error-1.59-1.fc45.ppc64le libidn2-2.3.8-3.fc44.ppc64le 
libksba-1.6.8-1.fc45.ppc64le liblastlog2-2.41.3-12.fc44.ppc64le libmount-2.41.3-12.fc44.ppc64le libnghttp2-1.68.0-3.fc44.ppc64le libnghttp3-1.15.0-1.fc44.ppc64le libpkgconf-2.5.1-1.fc45.ppc64le libpsl-0.21.5-7.fc44.ppc64le librtas-2.0.6-6.fc44.ppc64le libselinux-3.10-1.fc44.ppc64le libselinux-utils-3.10-1.fc44.ppc64le libsemanage-3.10-1.fc44.ppc64le libsepol-3.10-1.fc44.ppc64le libsmartcols-2.41.3-12.fc44.ppc64le libssh-0.12.0-1.fc45.ppc64le libssh-config-0.12.0-1.fc45.noarch libstdc++-16.0.1-0.9.fc45.ppc64le libtasn1-4.20.0-3.fc44.ppc64le libtool-ltdl-2.5.4-10.fc44.ppc64le libunistring-1.1-11.fc44.ppc64le libusb1-1.0.29-5.fc44.ppc64le libuuid-2.41.3-12.fc44.ppc64le libverto-0.3.2-12.fc44.ppc64le libxcrypt-4.5.2-3.fc44.ppc64le libxml2-2.12.10-6.fc44.ppc64le libzstd-1.5.7-5.fc44.ppc64le linkdupes-0.7.2-4.fc45.ppc64le lua-libs-5.5.0-1.fc45.ppc64le lua-srpm-macros-1-17.fc44.noarch lz4-libs-1.10.0-4.fc44.ppc64le mpfr-4.2.2-3.fc44.ppc64le ncurses-base-6.6-1.fc44.noarch ncurses-libs-6.6-1.fc44.ppc64le nettle-3.10.1-3.fc44.ppc64le ngtcp2-1.19.0-2.fc44.ppc64le ngtcp2-crypto-ossl-1.19.0-2.fc44.ppc64le npth-1.8-4.fc44.ppc64le ocaml-srpm-macros-11-3.fc44.noarch openblas-srpm-macros-2-21.fc44.noarch openldap-2.6.10-7.fc44.ppc64le openssl-libs-3.5.5-1.fc44.ppc64le p11-kit-0.26.2-1.fc45.ppc64le p11-kit-trust-0.26.2-1.fc45.ppc64le package-notes-srpm-macros-0.17-3.fc45.noarch pam-libs-1.7.2-1.fc44.ppc64le patch-2.8-4.fc44.ppc64le pcre2-10.47-1.fc44.1.ppc64le pcre2-syntax-10.47-1.fc44.1.noarch perl-srpm-macros-1-61.fc44.noarch pkgconf-2.5.1-1.fc45.ppc64le pkgconf-m4-2.5.1-1.fc45.noarch pkgconf-pkg-config-2.5.1-1.fc45.ppc64le policycoreutils-3.10-2.fc45.ppc64le popt-1.19-10.fc44.ppc64le publicsuffix-list-dafsa-20260116-1.fc44.noarch pyproject-srpm-macros-1.18.7-1.fc45.noarch python-srpm-macros-3.14-10.fc44.noarch qt5-srpm-macros-5.15.18-2.fc44.noarch qt6-srpm-macros-6.10.2-1.fc45.noarch readline-8.3-4.fc44.ppc64le redhat-rpm-config-344-1.fc45.noarch rpm-6.0.1-5.fc45.ppc64le 
rpm-build-6.0.1-5.fc45.ppc64le rpm-build-libs-6.0.1-5.fc45.ppc64le rpm-libs-6.0.1-5.fc45.ppc64le rpm-plugin-selinux-6.0.1-5.fc45.ppc64le rpm-sequoia-1.10.1-1.fc45.ppc64le rpm-sign-libs-6.0.1-5.fc45.ppc64le rust-srpm-macros-28.4-3.fc44.noarch sed-4.9-8.fc45.ppc64le selinux-policy-43.1-1.fc45.noarch selinux-policy-targeted-43.1-1.fc45.noarch setup-2.15.0-28.fc44.noarch shadow-utils-4.19.3-1.fc45.ppc64le sqlite-libs-3.51.2-1.fc44.ppc64le systemd-libs-260~rc2-1.fc45.ppc64le systemd-standalone-sysusers-260~rc2-1.fc45.ppc64le tar-1.35-8.fc44.ppc64le tpm2-tss-4.1.3-9.fc44.ppc64le tree-sitter-srpm-macros-0.4.2-2.fc44.noarch unzip-6.0-69.fc44.ppc64le util-linux-2.41.3-12.fc44.ppc64le util-linux-core-2.41.3-12.fc44.ppc64le which-2.23-4.fc44.ppc64le xxhash-libs-0.8.3-4.fc44.ppc64le xz-5.8.2-2.fc44.ppc64le xz-libs-5.8.2-2.fc44.ppc64le zig-srpm-macros-1-8.fc44.noarch zip-3.0-45.fc44.ppc64le zlib-ng-compat-2.3.3-5.fc45.ppc64le zstd-1.5.7-5.fc44.ppc64le
Start: buildsrpm
Start: rpmbuild -bs
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1773273600
Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.src.rpm
Finish: rpmbuild -bs
INFO: chroot_scan: 1 files copied to /var/lib/copr-rpmbuild/results/chroot_scan
INFO: /var/lib/mock/fedora-rawhide-ppc64le-1773335321.895804/root/var/log/dnf5.log
INFO: chroot_scan: creating tarball /var/lib/copr-rpmbuild/results/chroot_scan.tar.gz
/bin/tar: Removing leading `/' from member names
Finish: buildsrpm
INFO: Done(/var/lib/copr-rpmbuild/workspace/workdir-7exii33v/python-pydocket/python-pydocket.spec) Config(child) 0 minutes 45 seconds
INFO: Results and/or logs in: /var/lib/copr-rpmbuild/results
INFO: Cleaning up build root ('cleanup_on_success=True')
Start: clean chroot
INFO: unmounting tmpfs.
Finish: clean chroot
INFO: Start(/var/lib/copr-rpmbuild/results/python-pydocket-0.17.9-2.fc45.src.rpm) Config(fedora-rawhide-ppc64le)
Start(bootstrap): chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-ppc64le-bootstrap-1773335321.895804/root.
INFO: reusing tmpfs at /var/lib/mock/fedora-rawhide-ppc64le-bootstrap-1773335321.895804/root.
INFO: calling preinit hooks
INFO: enabled root cache
INFO: enabled package manager cache
Start(bootstrap): cleaning package manager metadata
Finish(bootstrap): cleaning package manager metadata
Finish(bootstrap): chroot init
Start: chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-ppc64le-1773335321.895804/root.
INFO: calling preinit hooks
INFO: enabled root cache
Start: unpacking root cache
Finish: unpacking root cache
INFO: enabled package manager cache
Start: cleaning package manager metadata
Finish: cleaning package manager metadata
INFO: enabled HW Info plugin
INFO: Buildroot is handled by package management downloaded with a bootstrap image:
rpm-6.0.1-5.fc45.ppc64le rpm-sequoia-1.10.1-1.fc45.ppc64le dnf5-5.4.0.0-4.fc45.ppc64le dnf5-plugins-5.4.0.0-4.fc45.ppc64le
Finish: chroot init
Start: build phase for python-pydocket-0.17.9-2.fc45.src.rpm
Start: build setup for python-pydocket-0.17.9-2.fc45.src.rpm
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1773273600
Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.src.rpm
Updating and loading repositories:
 fedora 100% | 11.5 KiB/s | 3.6 KiB | 00m00s
 Copr repository 100% | 5.4 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package                     Arch    Version          Repository  Size
Installing:
 compat-lua-libs            ppc64le 0:5.1.5-31.fc44  fedora      561.9 KiB
 python3-devel              ppc64le 0:3.14.3-1.fc45  fedora      1.9 MiB
 python3-docker             noarch  0:7.1.0-10.fc44  fedora      1.1 MiB
 python3-fakeredis          noarch  0:2.34.0-2.fc45  fedora      1.2 MiB
 python3-pytest             noarch  0:8.4.2-3.fc45   fedora      22.5 MiB
 python3-pytest-asyncio     noarch  0:1.1.0-3.fc44   fedora      131.1 KiB
 python3-pytest-xdist       noarch  0:3.7.0-6.fc44   fedora      468.8 KiB
 tomcli                     noarch  0:0.10.1-4.fc44  fedora      150.3 KiB
Installing dependencies:
 expat                      ppc64le 0:2.7.4-1.fc45   fedora      429.0 KiB
 mpdecimal                  ppc64le 0:4.0.1-3.fc44   fedora      281.1 KiB
 pyproject-rpm-macros       noarch  0:1.18.7-1.fc45  fedora      115.5 KiB
 python-pip-wheel           noarch  0:26.0.1-1.fc45  fedora      1.2 MiB
 python-rpm-macros          noarch  0:3.14-10.fc44   fedora      27.6 KiB
 python3                    ppc64le 0:3.14.3-1.fc45  fedora      84.6 KiB
 python3-charset-normalizer noarch  0:3.4.5-1.fc45   fedora      366.3 KiB
 python3-click              noarch  1:8.3.1-1.fc45   fedora      1.2 MiB
 python3-execnet            noarch  0:2.1.2-2.fc44   fedora      970.6 KiB
 python3-idna               noarch  0:3.11-2.fc44    fedora      738.4 KiB
 python3-iniconfig          noarch  0:2.3.0-2.fc44   fedora      49.8 KiB
 python3-libs               ppc64le 0:3.14.3-1.fc45  fedora      46.7 MiB
 python3-packaging          noarch  0:26.0-1.fc45    fedora      732.3 KiB
 python3-pluggy             noarch  0:1.6.0-5.fc44   fedora      211.5 KiB
 python3-pygments           noarch  0:2.19.1-9.fc44  fedora      11.3 MiB
 python3-redis              noarch  0:5.2.1-8.fc44   fedora      2.5 MiB
 python3-requests           noarch  0:2.32.5-5.fc45  fedora      476.9 KiB
 python3-rpm-generators     noarch  0:14-15.fc45     fedora      81.7 KiB
 python3-rpm-macros         noarch  0:3.14-10.fc44   fedora      6.5 KiB
 python3-sortedcontainers   noarch  0:2.4.0-26.fc44  fedora      392.9 KiB
 python3-tomlkit            noarch  0:0.13.2-7.fc44  fedora      493.9 KiB
 python3-urllib3            noarch  0:2.6.3-2.fc44   fedora      1.1 MiB
 tomcli+tomlkit             noarch  0:0.10.1-4.fc44  fedora      0.0 B
 tzdata                     noarch  0:2025c-2.fc44   fedora      1.2 MiB
Transaction Summary:
 Installing: 32 packages
Total size of inbound packages is 21 MiB. Need to download 21 MiB.
After this operation, 99 MiB extra will be used (install 99 MiB, remove 0 B). [ 1/32] python3-devel-0:3.14.3-1.fc45.p 100% | 14.7 MiB/s | 437.1 KiB | 00m00s [ 2/32] python3-docker-0:7.1.0-10.fc44. 100% | 9.4 MiB/s | 297.3 KiB | 00m00s [ 3/32] compat-lua-libs-0:5.1.5-31.fc44 100% | 5.1 MiB/s | 177.6 KiB | 00m00s [ 4/32] python3-fakeredis-0:2.34.0-2.fc 100% | 49.3 MiB/s | 353.1 KiB | 00m00s [ 5/32] python3-pytest-asyncio-0:1.1.0- 100% | 13.4 MiB/s | 41.3 KiB | 00m00s [ 6/32] python3-pytest-xdist-0:3.7.0-6. 100% | 38.2 MiB/s | 117.5 KiB | 00m00s [ 7/32] tomcli-0:0.10.1-4.fc44.noarch 100% | 36.6 MiB/s | 75.0 KiB | 00m00s [ 8/32] python3-0:3.14.3-1.fc45.ppc64le 100% | 27.6 MiB/s | 28.2 KiB | 00m00s [ 9/32] python3-requests-0:2.32.5-5.fc4 100% | 52.1 MiB/s | 160.0 KiB | 00m00s [10/32] python3-urllib3-0:2.6.3-2.fc44. 100% | 49.1 MiB/s | 301.9 KiB | 00m00s [11/32] python3-pytest-0:8.4.2-3.fc45.n 100% | 68.2 MiB/s | 2.3 MiB | 00m00s [12/32] python3-sortedcontainers-0:2.4. 100% | 31.2 MiB/s | 64.0 KiB | 00m00s [13/32] python3-redis-0:5.2.1-8.fc44.no 100% | 29.2 MiB/s | 597.5 KiB | 00m00s [14/32] python3-iniconfig-0:2.3.0-2.fc4 100% | 6.4 MiB/s | 26.3 KiB | 00m00s [15/32] python3-pluggy-0:1.6.0-5.fc44.n 100% | 30.1 MiB/s | 61.7 KiB | 00m00s [16/32] python3-packaging-0:26.0-1.fc45 100% | 45.3 MiB/s | 185.5 KiB | 00m00s [17/32] python3-execnet-0:2.1.2-2.fc44. 100% | 52.3 MiB/s | 267.9 KiB | 00m00s [18/32] python3-click-1:8.3.1-1.fc45.no 100% | 65.2 MiB/s | 266.9 KiB | 00m00s [19/32] expat-0:2.7.4-1.fc45.ppc64le 100% | 42.9 MiB/s | 131.9 KiB | 00m00s [20/32] mpdecimal-0:4.0.1-3.fc44.ppc64l 100% | 54.4 MiB/s | 111.4 KiB | 00m00s [21/32] python3-pygments-0:2.19.1-9.fc4 100% | 72.1 MiB/s | 2.7 MiB | 00m00s [22/32] python-pip-wheel-0:26.0.1-1.fc4 100% | 42.6 MiB/s | 1.1 MiB | 00m00s [23/32] tzdata-0:2025c-2.fc44.noarch 100% | 46.5 MiB/s | 714.2 KiB | 00m00s [24/32] python3-charset-normalizer-0:3. 
100% | 19.6 MiB/s | 120.4 KiB | 00m00s [25/32] python3-idna-0:3.11-2.fc44.noar 100% | 40.6 MiB/s | 124.7 KiB | 00m00s [26/32] tomcli+tomlkit-0:0.10.1-4.fc44. 100% | 4.1 MiB/s | 8.4 KiB | 00m00s [27/32] python3-tomlkit-0:0.13.2-7.fc44 100% | 61.9 MiB/s | 126.7 KiB | 00m00s [28/32] pyproject-rpm-macros-0:1.18.7-1 100% | 21.7 MiB/s | 44.5 KiB | 00m00s [29/32] python-rpm-macros-0:3.14-10.fc4 100% | 19.3 MiB/s | 19.8 KiB | 00m00s [30/32] python3-rpm-generators-0:14-15. 100% | 27.8 MiB/s | 28.5 KiB | 00m00s [31/32] python3-rpm-macros-0:3.14-10.fc 100% | 12.1 MiB/s | 12.3 KiB | 00m00s [32/32] python3-libs-0:3.14.3-1.fc45.pp 100% | 66.5 MiB/s | 10.0 MiB | 00m00s -------------------------------------------------------------------------------- [32/32] Total 100% | 107.5 MiB/s | 21.0 MiB | 00m00s Running transaction [ 1/34] Verify package files 100% | 103.0 B/s | 32.0 B | 00m00s [ 2/34] Prepare transaction 100% | 231.0 B/s | 32.0 B | 00m00s [ 3/34] Installing python-rpm-macros-0: 100% | 13.9 MiB/s | 28.5 KiB | 00m00s [ 4/34] Installing python3-rpm-macros-0 100% | 6.6 MiB/s | 6.8 KiB | 00m00s [ 5/34] Installing pyproject-rpm-macros 100% | 9.6 MiB/s | 117.5 KiB | 00m00s [ 6/34] Installing tzdata-0:2025c-2.fc4 100% | 12.7 MiB/s | 1.5 MiB | 00m00s [ 7/34] Installing python-pip-wheel-0:2 100% | 174.7 MiB/s | 1.2 MiB | 00m00s [ 8/34] Installing mpdecimal-0:4.0.1-3. 100% | 69.0 MiB/s | 282.7 KiB | 00m00s [ 9/34] Installing expat-0:2.7.4-1.fc45 100% | 7.0 MiB/s | 431.1 KiB | 00m00s [10/34] Installing python3-libs-0:3.14. 100% | 105.0 MiB/s | 47.1 MiB | 00m00s [11/34] Installing python3-0:3.14.3-1.f 100% | 2.0 MiB/s | 86.4 KiB | 00m00s [12/34] Installing python3-packaging-0: 100% | 66.2 MiB/s | 745.4 KiB | 00m00s [13/34] Installing python3-idna-0:3.11- 100% | 80.8 MiB/s | 744.9 KiB | 00m00s [14/34] Installing python3-urllib3-0:2. 100% | 66.9 MiB/s | 1.1 MiB | 00m00s [15/34] Installing python3-rpm-generato 100% | 27.0 MiB/s | 82.9 KiB | 00m00s [16/34] Installing python3-redis-0:5.2. 
100% | 80.8 MiB/s | 2.6 MiB | 00m00s [17/34] Installing python3-sortedcontai 100% | 77.6 MiB/s | 397.3 KiB | 00m00s [18/34] Installing python3-iniconfig-0: 100% | 17.6 MiB/s | 54.1 KiB | 00m00s [19/34] Installing python3-pluggy-0:1.6 100% | 21.3 MiB/s | 217.9 KiB | 00m00s [20/34] Installing python3-pygments-0:2 100% | 63.5 MiB/s | 11.5 MiB | 00m00s [21/34] Installing python3-pytest-0:8.4 100% | 100.8 MiB/s | 22.7 MiB | 00m00s [22/34] Installing python3-execnet-0:2. 100% | 60.6 MiB/s | 992.2 KiB | 00m00s [23/34] Installing python3-click-1:8.3. 100% | 94.5 MiB/s | 1.2 MiB | 00m00s [24/34] Installing python3-charset-norm 100% | 7.7 MiB/s | 376.4 KiB | 00m00s [25/34] Installing python3-requests-0:2 100% | 53.0 MiB/s | 488.9 KiB | 00m00s [26/34] Installing python3-tomlkit-0:0. 100% | 61.4 MiB/s | 503.2 KiB | 00m00s [27/34] Installing tomcli-0:0.10.1-4.fc 100% | 3.2 MiB/s | 162.9 KiB | 00m00s [28/34] Installing tomcli+tomlkit-0:0.1 100% | 60.5 KiB/s | 124.0 B | 00m00s [29/34] Installing python3-docker-0:7.1 100% | 50.6 MiB/s | 1.1 MiB | 00m00s [30/34] Installing python3-pytest-xdist 100% | 52.2 MiB/s | 481.3 KiB | 00m00s [31/34] Installing python3-pytest-async 100% | 33.0 MiB/s | 135.2 KiB | 00m00s [32/34] Installing python3-fakeredis-0: 100% | 54.9 MiB/s | 1.3 MiB | 00m00s [33/34] Installing python3-devel-0:3.14 100% | 26.7 MiB/s | 2.0 MiB | 00m00s [34/34] Installing compat-lua-libs-0:5. 100% | 8.0 MiB/s | 565.1 KiB | 00m00s Complete! Building target platforms: ppc64le Building for target ppc64le setting SOURCE_DATE_EPOCH=1773273600 Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.src.rpm Updating and loading repositories: fedora 100% | 11.7 KiB/s | 3.6 KiB | 00m00s Copr repository 100% | 5.4 KiB/s | 1.5 KiB | 00m00s Repositories loaded. Package "compat-lua-libs-5.1.5-31.fc44.ppc64le" is already installed. Package "python3-devel-3.14.3-1.fc45.ppc64le" is already installed. Package "python3-docker-7.1.0-10.fc44.noarch" is already installed. 
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-pytest-8.4.2-3.fc45.noarch" is already installed.
Package "python3-pytest-asyncio-1.1.0-3.fc44.noarch" is already installed.
Package "python3-pytest-xdist-3.7.0-6.fc44.noarch" is already installed.
Package "tomcli-0.10.1-4.fc44.noarch" is already installed.
Nothing to do.
Finish: build setup for python-pydocket-0.17.9-2.fc45.src.rpm
Start: rpmbuild python-pydocket-0.17.9-2.fc45.src.rpm
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1773273600
Executing(%mkbuilddir): /bin/sh -e /var/tmp/rpm-tmp.T7XRd8
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.wookt2
+ umask 022
+ cd /builddir/build/BUILD/python-pydocket-0.17.9-build
+ cd /builddir/build/BUILD/python-pydocket-0.17.9-build
+ rm -rf pydocket-0.17.9
+ /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/pydocket-0.17.9.tar.gz
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd pydocket-0.17.9
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/remove-pytest-unrecognized-arguments.diff
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f
+ tomcli set pyproject.toml arrays replace project.dependencies 'croniter>=([0-9]+)' 'croniter>=5'
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.K4F8hs
+ umask 022
+ cd /builddir/build/BUILD/python-pydocket-0.17.9-build
+ cd pydocket-0.17.9
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(packaging)'
+ echo 'python3dist(pip) >= 19'
+ '[' -f pyproject.toml ']'
+ echo '(python3dist(tomli) if python3-devel < 3.11)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ VALAFLAGS=-g
+ RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none --cap-lints=warn'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 '
+ LT_SYS_LIBRARY_PATH=/usr/lib64:
+ CC=gcc
+ CXX=g++
+ TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ RPM_TOXENV=py314
+ FEDORA=45
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir --output /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
Handling hatchling from build-system.requires
Requirement not satisfied: hatchling
Handling hatch-vcs from build-system.requires
Requirement not satisfied: hatch-vcs
Exiting dependency generation pass: build backend
+ cat /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
+ rm -rfv '*.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.buildreqs.nosrc.rpm
INFO: Going to install missing dynamic buildrequires
Updating and loading repositories:
fedora 100% | 12.7 KiB/s | 3.6 KiB | 00m00s
Copr repository 100% | 5.9 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package                          Arch     Version                Repository      Size
Installing:
 python3-hatch-vcs               noarch   0:0.5.0-7.fc45         fedora      34.9 KiB
 python3-hatchling               noarch   0:1.29.0-1.fc45        fedora     659.2 KiB
 python3-pip                     noarch   0:26.0.1-1.fc45        fedora      11.4 MiB
Installing dependencies:
 python3-pathspec                noarch   0:1.0.3-2.fc44         fedora     372.4 KiB
 python3-setuptools              noarch   0:80.10.2-2.fc45       fedora       7.5 MiB
 python3-setuptools_scm          noarch   0:9.2.2-5.fc44         fedora     502.5 KiB
 python3-trove-classifiers       noarch   0:2026.1.14.14-2.fc44  fedora     111.9 KiB
Transaction Summary:
 Installing: 7 packages
Package "compat-lua-libs-5.1.5-31.fc44.ppc64le" is already installed.
Package "pyproject-rpm-macros-1.18.7-1.fc45.noarch" is already installed.
Package "python3-devel-3.14.3-1.fc45.ppc64le" is already installed.
Package "python3-docker-7.1.0-10.fc44.noarch" is already installed.
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-pytest-8.4.2-3.fc45.noarch" is already installed.
Package "python3-pytest-asyncio-1.1.0-3.fc44.noarch" is already installed.
Package "python3-pytest-xdist-3.7.0-6.fc44.noarch" is already installed.
Package "python3-packaging-26.0-1.fc45.noarch" is already installed.
Package "tomcli-0.10.1-4.fc44.noarch" is already installed.
Total size of inbound packages is 5 MiB. Need to download 5 MiB.
After this operation, 21 MiB extra will be used (install 21 MiB, remove 0 B).
[1/7] python3-hatch-vcs-0:0.5.0-7.fc45. 100% | 1.1 MiB/s | 27.8 KiB | 00m00s
[2/7] python3-setuptools_scm-0:9.2.2-5. 100% | 25.9 MiB/s | 159.4 KiB | 00m00s
[3/7] python3-pathspec-0:1.0.3-2.fc44.n 100% | 26.3 MiB/s | 107.8 KiB | 00m00s
[4/7] python3-hatchling-0:1.29.0-1.fc45 100% | 5.9 MiB/s | 229.2 KiB | 00m00s
[5/7] python3-trove-classifiers-0:2026. 100% | 16.7 MiB/s | 34.2 KiB | 00m00s
[6/7] python3-pip-0:26.0.1-1.fc45.noarc 100% | 51.2 MiB/s | 2.7 MiB | 00m00s
[7/7] python3-setuptools-0:80.10.2-2.fc 100% | 31.9 MiB/s | 1.8 MiB | 00m00s
--------------------------------------------------------------------------------
[7/7] Total 100% | 52.2 MiB/s | 5.0 MiB | 00m00s
Running transaction
[1/9] Verify package files 100% | 97.0 B/s | 7.0 B | 00m00s
[2/9] Prepare transaction 100% | 85.0 B/s | 7.0 B | 00m00s
[3/9] Installing python3-setuptools-0:8 100% | 67.7 MiB/s | 7.7 MiB | 00m00s
[4/9] Installing python3-setuptools_scm 100% | 9.3 MiB/s | 526.4 KiB | 00m00s
[5/9] Installing python3-trove-classifi 100% | 2.5 MiB/s | 115.7 KiB | 00m00s
[6/9] Installing python3-pathspec-0:1.0 100% | 27.5 MiB/s | 394.6 KiB | 00m00s
[7/9] Installing python3-hatchling-0:1. 100% | 10.5 MiB/s | 708.4 KiB | 00m00s
[8/9] Installing python3-hatch-vcs-0:0. 100% | 3.6 MiB/s | 41.0 KiB | 00m00s
[9/9] Installing python3-pip-0:26.0.1-1 100% | 43.9 MiB/s | 11.7 MiB | 00m00s
Complete!
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1773273600
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.NPepxx
+ umask 022
+ cd /builddir/build/BUILD/python-pydocket-0.17.9-build
+ cd pydocket-0.17.9
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(packaging)'
+ echo 'python3dist(pip) >= 19'
+ '[' -f pyproject.toml ']'
+ echo '(python3dist(tomli) if python3-devel < 3.11)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ VALAFLAGS=-g
+ RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none --cap-lints=warn'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 '
+ LT_SYS_LIBRARY_PATH=/usr/lib64:
+ CC=gcc
+ CXX=g++
+ TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ RPM_TOXENV=py314
+ FEDORA=45
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir --output /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
Handling hatchling from build-system.requires
Requirement satisfied: hatchling (installed: hatchling 1.29.0)
Handling hatch-vcs from build-system.requires
Requirement satisfied: hatch-vcs (installed: hatch-vcs 0.5.0)
Handling cloudpickle>=3.1.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: cloudpickle>=3.1.1
Handling croniter>=5 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: croniter>=5
Handling exceptiongroup>=1.2.0; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: exceptiongroup>=1.2.0; python_version < '3.11'
Handling fakeredis[lua]>=2.32.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: fakeredis[lua]>=2.32.1 (installed: fakeredis 2.34.0) (extras are currently not checked)
Handling opentelemetry-api>=1.33.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: opentelemetry-api>=1.33.0
Handling prometheus-client>=0.21.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: prometheus-client>=0.21.1
Handling py-key-value-aio[memory,redis]>=0.3.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: py-key-value-aio[memory,redis]>=0.3.0
Handling python-json-logger>=2.0.7 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: python-json-logger>=2.0.7
Handling redis>=5 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: redis>=5 (installed: redis 5.2.1)
Handling rich>=13.9.4 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: rich>=13.9.4
Handling taskgroup>=0.2.2; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: taskgroup>=0.2.2; python_version < '3.11'
Handling typer>=0.15.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: typer>=0.15.1
Handling typing-extensions>=4.12.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement not satisfied: typing-extensions>=4.12.0
Handling tzdata>=2025.2; sys_platform == 'win32' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: tzdata>=2025.2; sys_platform == 'win32'
Handling opentelemetry-sdk>=1.33.0; extra == 'metrics' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: opentelemetry-sdk>=1.33.0; extra == 'metrics'
+ cat /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
+ rm -rfv pydocket-0.17.9.dist-info/
removed 'pydocket-0.17.9.dist-info/METADATA'
removed directory 'pydocket-0.17.9.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.buildreqs.nosrc.rpm
INFO: Going to install missing dynamic buildrequires
Updating and loading repositories:
fedora 100% | 15.0 KiB/s | 3.6 KiB | 00m00s
Copr repository 100% | 7.1 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package "compat-lua-libs-5.1.5-31.fc44.ppc64le" is already installed.
Package "pyproject-rpm-macros-1.18.7-1.fc45.noarch" is already installed.
Package "python3-devel-3.14.3-1.fc45.ppc64le" is already installed.
Package "python3-docker-7.1.0-10.fc44.noarch" is already installed.
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-pytest-8.4.2-3.fc45.noarch" is already installed.
Package "python3-pytest-asyncio-1.1.0-3.fc44.noarch" is already installed.
Package "python3-pytest-xdist-3.7.0-6.fc44.noarch" is already installed.
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-hatch-vcs-0.5.0-7.fc45.noarch" is already installed.
Package "python3-hatchling-1.29.0-1.fc45.noarch" is already installed.
Package "python3-packaging-26.0-1.fc45.noarch" is already installed.
Package "python3-pip-26.0.1-1.fc45.noarch" is already installed.
Package "python3-redis-5.2.1-8.fc44.noarch" is already installed.
Package "tomcli-0.10.1-4.fc44.noarch" is already installed.
Package                          Arch     Version                Repository      Size
Installing:
 python3-cloudpickle             noarch   0:3.1.2-2.fc44         fedora     129.7 KiB
 python3-croniter                noarch   0:5.0.1-7.fc45         fedora     200.0 KiB
 python3-fakeredis+lua           noarch   0:2.34.0-2.fc45        fedora         0.0 B
 python3-json-logger             noarch   0:4.0.0-2.fc44         fedora      83.3 KiB
 python3-opentelemetry-api       noarch   0:1.39.1-1.fc45        fedora     438.0 KiB
 python3-prometheus_client       noarch   0:0.23.0-3.fc44        fedora     425.1 KiB
 python3-py-key-value-aio        noarch   0:0.3.0-2.fc45         fedora     756.1 KiB
 python3-py-key-value-aio+memory noarch   0:0.3.0-2.fc45         fedora         0.0 B
 python3-py-key-value-aio+redis  noarch   0:0.3.0-2.fc45         fedora         0.0 B
 python3-rich                    noarch   0:14.3.3-1.fc45        fedora       3.4 MiB
 python3-typer                   noarch   0:0.24.1-1.fc45        fedora     846.8 KiB
 python3-typing-extensions       noarch   0:4.15.0-4.fc45        fedora     538.8 KiB
Installing dependencies:
 lua5.4-libs                     ppc64le  0:5.5.0-1.fc45         fedora     393.6 KiB
 python3-annotated-doc           noarch   0:0.0.4-3.fc44         fedora      16.7 KiB
 python3-beartype                noarch   0:0.22.9-1.fc45        fedora       9.1 MiB
 python3-cachetools              noarch   0:7.0.5-1.fc45         fedora     152.8 KiB
 python3-dateutil                noarch   1:2.9.0.post0-7.fc44   fedora     878.0 KiB
 python3-decorator               noarch   0:5.2.1-6.fc44         fedora      81.8 KiB
 python3-importlib-metadata      noarch   0:8.7.1-2.fc44         fedora     180.4 KiB
 python3-lupa                    ppc64le  0:2.6-3.fc44           fedora     592.5 KiB
 python3-markdown-it-py          noarch   0:4.0.0-1.fc45         fedora     540.6 KiB
 python3-mdurl                   noarch   0:0.1.2-14.fc44        fedora      44.0 KiB
 python3-py-key-value-shared     noarch   0:0.3.0-2.fc45         fedora     129.8 KiB
 python3-pytz                    noarch   0:2026.1-1.fc45        fedora     224.0 KiB
 python3-shellingham             noarch   0:1.5.4-14.fc44        fedora      43.6 KiB
 python3-six                     noarch   0:1.17.0-8.fc44        fedora     118.0 KiB
 python3-zipp                    noarch   0:3.23.0-3.fc44        fedora      60.7 KiB
Transaction Summary:
 Installing: 27 packages
Total size of inbound packages is 5 MiB. Need to download 5 MiB.
After this operation, 19 MiB extra will be used (install 19 MiB, remove 0 B).
[ 1/27] python3-cloudpickle-0:3.1.2-2.f 100% | 1.6 MiB/s | 48.2 KiB | 00m00s
[ 2/27] python3-croniter-0:5.0.1-7.fc45 100% | 1.7 MiB/s | 52.8 KiB | 00m00s
[ 3/27] python3-fakeredis+lua-0:2.34.0- 100% | 240.4 KiB/s | 7.7 KiB | 00m00s
[ 4/27] python3-opentelemetry-api-0:1.3 100% | 26.8 MiB/s | 137.1 KiB | 00m00s
[ 5/27] python3-prometheus_client-0:0.2 100% | 29.5 MiB/s | 151.1 KiB | 00m00s
[ 6/27] python3-py-key-value-aio-0:0.3. 100% | 34.5 MiB/s | 247.6 KiB | 00m00s
[ 7/27] python3-py-key-value-aio+memory 100% | 2.5 MiB/s | 7.5 KiB | 00m00s
[ 8/27] python3-py-key-value-aio+redis- 100% | 3.7 MiB/s | 7.5 KiB | 00m00s
[ 9/27] python3-json-logger-0:4.0.0-2.f 100% | 20.3 MiB/s | 41.6 KiB | 00m00s
[10/27] python3-typer-0:0.24.1-1.fc45.n 100% | 33.8 MiB/s | 172.9 KiB | 00m00s
[11/27] python3-typing-extensions-0:4.1 100% | 27.4 MiB/s | 112.3 KiB | 00m00s
[12/27] python3-rich-0:14.3.3-1.fc45.no 100% | 64.9 MiB/s | 664.3 KiB | 00m00s
[13/27] python3-pytz-0:2026.1-1.fc45.no 100% | 16.1 MiB/s | 65.8 KiB | 00m00s
[14/27] python3-dateutil-1:2.9.0.post0- 100% | 48.0 MiB/s | 344.4 KiB | 00m00s
[15/27] python3-importlib-metadata-0:8. 100% | 24.0 MiB/s | 73.7 KiB | 00m00s
[16/27] python3-lupa-0:2.6-3.fc44.ppc64 100% | 32.3 MiB/s | 165.3 KiB | 00m00s
[17/27] python3-decorator-0:5.2.1-6.fc4 100% | 15.6 MiB/s | 32.0 KiB | 00m00s
[18/27] python3-py-key-value-shared-0:0 100% | 32.7 MiB/s | 67.0 KiB | 00m00s
[19/27] python3-cachetools-0:7.0.5-1.fc 100% | 13.5 MiB/s | 55.3 KiB | 00m00s
[20/27] python3-markdown-it-py-0:4.0.0- 100% | 50.0 MiB/s | 205.0 KiB | 00m00s
[21/27] python3-annotated-doc-0:0.0.4-3 100% | 5.0 MiB/s | 15.2 KiB | 00m00s
[22/27] python3-shellingham-0:1.5.4-14. 100% | 11.1 MiB/s | 34.1 KiB | 00m00s
[23/27] python3-six-0:1.17.0-8.fc44.noa 100% | 10.3 MiB/s | 42.0 KiB | 00m00s
[24/27] python3-beartype-0:0.22.9-1.fc4 100% | 73.9 MiB/s | 1.7 MiB | 00m00s
[25/27] lua5.4-libs-0:5.5.0-1.fc45.ppc6 100% | 12.2 MiB/s | 149.5 KiB | 00m00s
[26/27] python3-zipp-0:3.23.0-3.fc44.no 100% | 3.6 MiB/s | 37.3 KiB | 00m00s
[27/27] python3-mdurl-0:0.1.2-14.fc44.n 100% | 10.2 MiB/s | 31.4 KiB | 00m00s
--------------------------------------------------------------------------------
[27/27] Total 100% | 53.5 MiB/s | 4.6 MiB | 00m00s
Running transaction
[ 1/29] Verify package files 100% | 385.0 B/s | 27.0 B | 00m00s
[ 2/29] Prepare transaction 100% | 250.0 B/s | 27.0 B | 00m00s
[ 3/29] Installing python3-beartype-0:0 100% | 53.0 MiB/s | 9.4 MiB | 00m00s
[ 4/29] Installing python3-typing-exten 100% | 105.7 MiB/s | 541.1 KiB | 00m00s
[ 5/29] Installing python3-py-key-value 100% | 14.3 MiB/s | 146.7 KiB | 00m00s
[ 6/29] Installing python3-py-key-value 100% | 21.7 MiB/s | 823.7 KiB | 00m00s
[ 7/29] Installing python3-mdurl-0:0.1. 100% | 9.7 MiB/s | 49.4 KiB | 00m00s
[ 8/29] Installing python3-markdown-it- 100% | 8.9 MiB/s | 584.7 KiB | 00m00s
[ 9/29] Installing python3-rich-0:14.3. 100% | 82.8 MiB/s | 3.5 MiB | 00m00s
[10/29] Installing python3-zipp-0:3.23. 100% | 13.1 MiB/s | 67.0 KiB | 00m00s
[11/29] Installing python3-importlib-me 100% | 31.1 MiB/s | 190.8 KiB | 00m00s
[12/29] Installing lua5.4-libs-0:5.5.0- 100% | 77.1 MiB/s | 394.9 KiB | 00m00s
[13/29] Installing python3-lupa-0:2.6-3 100% | 83.2 MiB/s | 596.5 KiB | 00m00s
[14/29] Installing python3-six-0:1.17.0 100% | 39.2 MiB/s | 120.3 KiB | 00m00s
[15/29] Installing python3-dateutil-1:2 100% | 72.6 MiB/s | 891.5 KiB | 00m00s
[16/29] Installing python3-shellingham- 100% | 12.2 MiB/s | 50.2 KiB | 00m00s
[17/29] Installing python3-annotated-do 100% | 6.6 MiB/s | 20.1 KiB | 00m00s
[18/29] Installing python3-cachetools-0 100% | 38.5 MiB/s | 157.7 KiB | 00m00s
[19/29] Installing python3-decorator-0: 100% | 27.5 MiB/s | 84.5 KiB | 00m00s
[20/29] Installing python3-pytz-0:2026. 100% | 44.8 MiB/s | 229.4 KiB | 00m00s
[21/29] Installing python3-croniter-0:5 100% | 49.6 MiB/s | 203.3 KiB | 00m00s
[22/29] Installing python3-prometheus_c 100% | 39.4 MiB/s | 443.4 KiB | 00m00s
[23/29] Installing python3-py-key-value 100% | 121.1 KiB/s | 124.0 B | 00m00s
[24/29] Installing python3-typer-0:0.24 100% | 16.1 MiB/s | 858.7 KiB | 00m00s
[25/29] Installing python3-fakeredis+lu 100% | 121.1 KiB/s | 124.0 B | 00m00s
[26/29] Installing python3-opentelemetr 100% | 26.8 MiB/s | 465.9 KiB | 00m00s
[27/29] Installing python3-py-key-value 100% | 121.1 KiB/s | 124.0 B | 00m00s
[28/29] Installing python3-json-logger- 100% | 22.1 MiB/s | 90.5 KiB | 00m00s
[29/29] Installing python3-cloudpickle- 100% | 2.6 MiB/s | 133.5 KiB | 00m00s
Complete!
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1773273600
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.7Of3kx
+ umask 022
+ cd /builddir/build/BUILD/python-pydocket-0.17.9-build
+ cd pydocket-0.17.9
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(packaging)'
+ echo 'python3dist(pip) >= 19'
+ '[' -f pyproject.toml ']'
+ echo '(python3dist(tomli) if python3-devel < 3.11)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules '
+ VALAFLAGS=-g
+ RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none --cap-lints=warn'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 '
+ LT_SYS_LIBRARY_PATH=/usr/lib64:
+ CC=gcc
+ CXX=g++
+ TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir
+ RPM_TOXENV=py314
+ FEDORA=45
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir --output /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
Handling hatchling from build-system.requires
Requirement satisfied: hatchling (installed: hatchling 1.29.0)
Handling hatch-vcs from build-system.requires
Requirement satisfied: hatch-vcs (installed: hatch-vcs 0.5.0)
Handling cloudpickle>=3.1.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: cloudpickle>=3.1.1 (installed: cloudpickle 3.1.2)
Handling croniter>=5 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: croniter>=5 (installed: croniter 5.0.1)
Handling exceptiongroup>=1.2.0; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: exceptiongroup>=1.2.0; python_version < '3.11'
Handling fakeredis[lua]>=2.32.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: fakeredis[lua]>=2.32.1 (installed: fakeredis 2.34.0) (extras are currently not checked)
Handling opentelemetry-api>=1.33.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: opentelemetry-api>=1.33.0 (installed: opentelemetry-api 1.39.1)
Handling prometheus-client>=0.21.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: prometheus-client>=0.21.1 (installed: prometheus-client 0.23.0)
Handling py-key-value-aio[memory,redis]>=0.3.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: py-key-value-aio[memory,redis]>=0.3.0 (installed: py-key-value-aio 0.3.0) (extras are currently not checked)
Handling python-json-logger>=2.0.7 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: python-json-logger>=2.0.7 (installed: python-json-logger 4.0.0)
Handling redis>=5 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: redis>=5 (installed: redis 5.2.1)
Handling rich>=13.9.4 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: rich>=13.9.4 (installed: rich 14.3.3)
Handling taskgroup>=0.2.2; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: taskgroup>=0.2.2; python_version < '3.11'
Handling typer>=0.15.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: typer>=0.15.1 (installed: typer 0.24.1)
Handling typing-extensions>=4.12.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: typing-extensions>=4.12.0 (installed: typing-extensions 4.15.0)
Handling tzdata>=2025.2; sys_platform == 'win32' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: tzdata>=2025.2; sys_platform == 'win32'
Handling opentelemetry-sdk>=1.33.0; extra == 'metrics' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: opentelemetry-sdk>=1.33.0; extra == 'metrics'
+ cat /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires
+ rm -rfv pydocket-0.17.9.dist-info/
removed 'pydocket-0.17.9.dist-info/METADATA'
removed directory 'pydocket-0.17.9.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-pydocket-0.17.9-2.fc45.buildreqs.nosrc.rpm
INFO: Going to install missing dynamic buildrequires
Updating and loading repositories:
fedora 100% | 15.2 KiB/s | 3.6 KiB | 00m00s
Copr repository 100% | 7.2 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Nothing to do.
Package "compat-lua-libs-5.1.5-31.fc44.ppc64le" is already installed.
Package "pyproject-rpm-macros-1.18.7-1.fc45.noarch" is already installed.
Package "python3-devel-3.14.3-1.fc45.ppc64le" is already installed.
Package "python3-docker-7.1.0-10.fc44.noarch" is already installed.
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-pytest-8.4.2-3.fc45.noarch" is already installed.
Package "python3-pytest-asyncio-1.1.0-3.fc44.noarch" is already installed.
Package "python3-pytest-xdist-3.7.0-6.fc44.noarch" is already installed.
Package "python3-cloudpickle-3.1.2-2.fc44.noarch" is already installed.
Package "python3-croniter-5.0.1-7.fc45.noarch" is already installed.
Package "python3-fakeredis-2.34.0-2.fc45.noarch" is already installed.
Package "python3-fakeredis+lua-2.34.0-2.fc45.noarch" is already installed.
Package "python3-hatch-vcs-0.5.0-7.fc45.noarch" is already installed.
Package "python3-hatchling-1.29.0-1.fc45.noarch" is already installed.
Package "python3-opentelemetry-api-1.39.1-1.fc45.noarch" is already installed.
Package "python3-packaging-26.0-1.fc45.noarch" is already installed.
Package "python3-pip-26.0.1-1.fc45.noarch" is already installed.
Package "python3-prometheus_client-0.23.0-3.fc44.noarch" is already installed.
Package "python3-py-key-value-aio-0.3.0-2.fc45.noarch" is already installed.
Package "python3-py-key-value-aio+memory-0.3.0-2.fc45.noarch" is already installed.
Package "python3-py-key-value-aio+redis-0.3.0-2.fc45.noarch" is already installed.
Package "python3-json-logger-4.0.0-2.fc44.noarch" is already installed. Package "python3-redis-5.2.1-8.fc44.noarch" is already installed. Package "python3-rich-14.3.3-1.fc45.noarch" is already installed. Package "python3-typer-0.24.1-1.fc45.noarch" is already installed. Package "python3-typing-extensions-4.15.0-4.fc45.noarch" is already installed. Package "tomcli-0.10.1-4.fc44.noarch" is already installed. Building target platforms: ppc64le Building for target ppc64le setting SOURCE_DATE_EPOCH=1773273600 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.lAVOXI + umask 022 + cd /builddir/build/BUILD/python-pydocket-0.17.9-build + cd pydocket-0.17.9 + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(packaging)' + echo 'python3dist(pip) >= 19' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir + echo -n + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + VALAFLAGS=-g + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none --cap-lints=warn' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + LT_SYS_LIBRARY_PATH=/usr/lib64: + CC=gcc + CXX=g++ + TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir + RPM_TOXENV=py314 + FEDORA=45 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir --output /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires Handling hatchling from build-system.requires Requirement satisfied: hatchling (installed: hatchling 1.29.0) Handling hatch-vcs from build-system.requires Requirement satisfied: hatch-vcs (installed: hatch-vcs 0.5.0) Handling cloudpickle>=3.1.1 from hook generated metadata: Requires-Dist (pydocket) Requirement satisfied: cloudpickle>=3.1.1 (installed: cloudpickle 3.1.2) Handling croniter>=5 from hook generated metadata: Requires-Dist (pydocket) Requirement satisfied: croniter>=5 (installed: croniter 5.0.1) 
Handling exceptiongroup>=1.2.0; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: exceptiongroup>=1.2.0; python_version < '3.11'
Handling fakeredis[lua]>=2.32.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: fakeredis[lua]>=2.32.1 (installed: fakeredis 2.34.0) (extras are currently not checked)
Handling opentelemetry-api>=1.33.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: opentelemetry-api>=1.33.0 (installed: opentelemetry-api 1.39.1)
Handling prometheus-client>=0.21.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: prometheus-client>=0.21.1 (installed: prometheus-client 0.23.0)
Handling py-key-value-aio[memory,redis]>=0.3.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: py-key-value-aio[memory,redis]>=0.3.0 (installed: py-key-value-aio 0.3.0) (extras are currently not checked)
Handling python-json-logger>=2.0.7 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: python-json-logger>=2.0.7 (installed: python-json-logger 4.0.0)
Handling redis>=5 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: redis>=5 (installed: redis 5.2.1)
Handling rich>=13.9.4 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: rich>=13.9.4 (installed: rich 14.3.3)
Handling taskgroup>=0.2.2; python_version < '3.11' from hook generated metadata: Requires-Dist (pydocket)
Ignoring alien requirement: taskgroup>=0.2.2; python_version < '3.11'
Handling typer>=0.15.1 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: typer>=0.15.1 (installed: typer 0.24.1)
Handling typing-extensions>=4.12.0 from hook generated metadata: Requires-Dist (pydocket)
Requirement satisfied: typing-extensions>=4.12.0 (installed: typing-extensions 4.15.0)
Handling tzdata>=2025.2; sys_platform == 'win32' from hook generated metadata:
Requires-Dist (pydocket) Ignoring alien requirement: tzdata>=2025.2; sys_platform == 'win32' Handling opentelemetry-sdk>=1.33.0; extra == 'metrics' from hook generated metadata: Requires-Dist (pydocket) Ignoring alien requirement: opentelemetry-sdk>=1.33.0; extra == 'metrics' + cat /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-buildrequires + rm -rfv pydocket-0.17.9.dist-info/ removed 'pydocket-0.17.9.dist-info/METADATA' removed directory 'pydocket-0.17.9.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.ErL8bS + umask 022 + cd /builddir/build/BUILD/python-pydocket-0.17.9-build + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe 
-Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd pydocket-0.17.9 + mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + VALAFLAGS=-g + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + LT_SYS_LIBRARY_PATH=/usr/lib64: + CC=gcc + CXX=g++ + TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_wheel.py /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir Processing ./. 
Preparing metadata (pyproject.toml): started Running command Preparing metadata (pyproject.toml) Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: pydocket Building wheel for pydocket (pyproject.toml): started Running command Building wheel for pydocket (pyproject.toml) Building wheel for pydocket (pyproject.toml): finished with status 'done' Created wheel for pydocket: filename=pydocket-0.17.9-py3-none-any.whl size=94504 sha256=4a3205cc3607b727b26472e5b30a6ff2491f57f53465ff4bcd7ea8cdd28fb469 Stored in directory: /builddir/.cache/pip/wheels/a6/e8/0d/fb807c0d79f6ed229893882c4aec8a3062a0de48adb83c2299 Successfully built pydocket + RPM_EC=0 ++ jobs -p + exit 0 Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.dG7m6D + umask 022 + cd /builddir/build/BUILD/python-pydocket-0.17.9-build + '[' /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT '!=' / ']' + rm -rf /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT ++ dirname /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT + mkdir -p /builddir/build/BUILD/python-pydocket-0.17.9-build + mkdir /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + 
export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd pydocket-0.17.9 ++ ls /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir/pydocket-0.17.9-py3-none-any.whl ++ xargs basename --multiple ++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/' + specifier=pydocket==0.17.9 + '[' -z pydocket==0.17.9 ']' + TMPDIR=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir + /usr/bin/python3 -m pip install --root /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT --prefix /usr --no-deps --disable-pip-version-check --progress-bar off --verbose 
--ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir pydocket==0.17.9 Using pip 26.0.1 from /usr/lib/python3.14/site-packages/pip (python 3.14) Looking in links: /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir Processing /builddir/build/BUILD/python-pydocket-0.17.9-build/pyproject-wheeldir/pydocket-0.17.9-py3-none-any.whl Installing collected packages: pydocket Creating /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin changing mode of /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin/docket to 755 Successfully installed pydocket-0.17.9 + '[' -d /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin ']' + '[' -z sP ']' + shebang_flags=-kasP + /usr/bin/python3 -B /usr/lib/rpm/redhat/pathfix.py -pni /usr/bin/python3 -kasP /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin/docket /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin/docket: updating + rm -rfv /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin/__pycache__ + rm -f /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-ghost-distinfo + site_dirs=() + '[' -d /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages ']' + site_dirs+=("/usr/lib/python3.14/site-packages") + '[' /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib64/python3.14/site-packages '!=' /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages ']' + '[' -d /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib64/python3.14/site-packages ']' + for site_dir in ${site_dirs[@]} + for distinfo in /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT$site_dir/*.dist-info + echo '%ghost %dir /usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info' + sed -i 
s/pip/rpm/ /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/INSTALLER + PYTHONPATH=/usr/lib/rpm/redhat + /usr/bin/python3 -B /usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT --record /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/RECORD --output /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-record + rm -fv /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/RECORD removed '/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/RECORD' + rm -fv /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/REQUESTED removed '/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/pydocket-0.17.9.dist-info/REQUESTED' ++ cut -f1 '-d ' ++ wc -l /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-ghost-distinfo + lines=1 + '[' 1 -ne 1 ']' + RPM_FILES_ESCAPE=4.19 + /usr/bin/python3 /usr/lib/rpm/redhat/pyproject_save_files.py --output-files /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-files --output-modules /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-modules --buildroot /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT --sitelib /usr/lib/python3.14/site-packages --sitearch /usr/lib64/python3.14/site-packages --python-version 3.14 --pyproject-record /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-record --prefix /usr -l docket + /usr/lib/rpm/check-buildroot + 
/usr/lib/rpm/redhat/brp-ldconfig + COMPRESS='gzip -9 -n' + COMPRESS_EXT=.gz + /usr/lib/rpm/brp-compress + /usr/lib/rpm/brp-strip /usr/bin/strip + /usr/lib/rpm/brp-strip-comment-note /usr/bin/strip /usr/bin/objdump + /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip + /usr/lib/rpm/check-rpaths + /usr/lib/rpm/redhat/brp-mangle-shebangs + /usr/lib/rpm/brp-remove-la-files + /usr/lib/rpm/redhat/brp-python-rpm-in-distinfo + env /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0 -j5 Bytecompiling .py files below /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14 using python3.14 + /usr/lib/rpm/redhat/brp-python-hardlink + /usr/bin/add-det --brp -j5 /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/tasks.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/testing.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/instrumentation.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_result_store.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/strikelist.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/execution.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/docket.cpython-314.opt-1.pyc: rewriting with normalized 
contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/annotations.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/agenda.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/worker.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_uuid7.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_telemetry.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_prometheus_exporter.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/strikelist.cpython-314.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_cancellation.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/__main__.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_redis.cpython-314.pyc: replacing with normalized version 
/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_execution_progress.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_redis.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/__pycache__/_docket_snapshot.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_retry.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_base.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_cron.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_functional.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_timeout.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_resolution.cpython-314.pyc: replacing with normalized version 
/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_progress.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_contextual.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_resolution.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_contextual.cpython-314.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_progress.cpython-314.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_perpetual.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_concurrency.cpython-314.opt-1.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/cli/__pycache__/_support.cpython-314.opt-1.pyc: rewriting with normalized contents /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/dependencies/__pycache__/_concurrency.cpython-314.pyc: replacing with normalized version /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/cli/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents Scanned 14 directories and 103 files, processed 38 inodes, 38 modified (12 replaced + 26 
rewritten), 0 unsupported format, 0 errors + /usr/bin/linkdupes --brp /builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr Scanned 13 directories and 103 files, considered 103 files, read 0 files, linked 0 files, 0 errors sum of sizes of linked files: 0 bytes Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.5ItPqM + umask 022 + cd /builddir/build/BUILD/python-pydocket-0.17.9-build + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd pydocket-0.17.9 + '[' '!' -f /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-modules ']' + PATH=/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin + PYTHONPATH=/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib64/python3.14/site-packages:/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages + _PYTHONSITE=/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib64/python3.14/site-packages:/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages + PYTHONDONTWRITEBYTECODE=1 + /usr/bin/python3 -sP /usr/lib/rpm/redhat/import_all_modules.py -f /builddir/build/BUILD/python-pydocket-0.17.9-build/python-pydocket-0.17.9-2.fc45.ppc64le-pyproject-modules Check import: docket Check import: docket.agenda Check import: docket.annotations Check import: docket.cli Check import: docket.dependencies Check import: docket.docket Check import: docket.execution Check import: docket.instrumentation Check import: docket.strikelist Check import: docket.tasks Check import: docket.testing Check import: docket.worker + export REDIS_VERSION=memory + REDIS_VERSION=memory + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection '
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes '
+ PATH=/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin
+ PYTHONPATH=/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib64/python3.14/site-packages:/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages
+ PYTHONDONTWRITEBYTECODE=1
+ PYTEST_ADDOPTS=' --ignore=/builddir/build/BUILD/python-pydocket-0.17.9-build/.pyproject-builddir'
+ PYTEST_XDIST_AUTO_NUM_WORKERS=5
+ /usr/bin/pytest --ignore tests/instrumentation/test_tracing.py -k 'not test_exports_metrics_as_prometheus_metrics and not test_json_logging_format'
============================= test session starts ==============================
platform linux -- Python 3.14.3, pytest-8.4.2, pluggy-1.6.0
rootdir: /builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9
configfile: pyproject.toml
plugins: asyncio-1.1.0, xdist-3.7.0
asyncio: mode=Mode.AUTO, asyncio_default_fixture_loop_scope=function, asyncio_default_test_loop_scope=function
collected 688 items / 2 deselected / 686 selected

tests/cli/test_clear.py sssssssss [  1%]
tests/cli/test_iterate_with_timeout.py ..... [  2%]
tests/cli/test_module.py .
[  2%]
tests/cli/test_parsing.py ssssssssss [  3%]
tests/cli/test_snapshot.py sssssssssssss [  5%]
tests/cli/test_striking.py sssssssssssssssss [  8%]
tests/cli/test_tasks.py sss [  8%]
tests/cli/test_url_validation.py .............. [ 10%]
tests/cli/test_version.py .. [ 10%]
tests/cli/test_watch.py ssssssssssss [ 12%]
tests/cli/test_worker.py ..sFEFEss [ 13%]
tests/cli/test_workers.py ss [ 13%]
tests/concurrency_limits/test_basic.py FEFEFEFEFEFEFE.FE....FE [ 15%]
tests/concurrency_limits/test_errors_and_resilience.py FEFEFEFEFEFEFE [ 16%]
tests/concurrency_limits/test_execution_patterns.py FEFEFEFEFEFEFE [ 17%]
tests/concurrency_limits/test_redelivery.py FEFEFEFEFEFE [ 18%]
tests/concurrency_limits/test_worker_mechanics.py FEFEFEFEFEFEFEFEFEFE [ 20%]
tests/fundamentals/test_async_dependencies.py FEFEFEFEFEFEFEFEFE [ 21%]
tests/fundamentals/test_builtin_tasks.py FEFE [ 21%]
tests/fundamentals/test_cancellation.py FEFEFE [ 22%]
tests/fundamentals/test_context_injection.py FEFEFEFE [ 22%]
tests/fundamentals/test_cron.py FEFEFEFEFEFEFE [ 23%]
tests/fundamentals/test_errors.py .FE [ 24%]
tests/fundamentals/test_idempotency.py FEFEFEFE [ 24%]
tests/fundamentals/test_logging.py FEFEFE [ 25%]
tests/fundamentals/test_perpetual.py FEFEFEFEFEFEFEFE [ 26%]
tests/fundamentals/test_progress_state.py FEFEFE [ 26%]
tests/fundamentals/test_results.py FE [ 26%]
tests/fundamentals/test_retries.py FEFEFEFEFEFE [ 27%]
tests/fundamentals/test_scheduling.py FEFEFEFEFEFEFE [ 28%]
tests/fundamentals/test_self_perpetuation.py FEFEFE [ 29%]
tests/fundamentals/test_shared_dependencies.py FEFEFEFEFEFEFEFEFEFEFEFEF [ 31%]
E [ 31%]
tests/fundamentals/test_striking.py FEFF [ 31%]
tests/fundamentals/test_sync_dependencies.py FEFEFEFEFEFE [ 32%]
tests/fundamentals/test_timeouts.py FEFEFEFEFE [ 33%]
tests/instrumentation/test_counters.py FEFEFEFEFEFEFEFEFEFEFEFE [ 34%]
tests/instrumentation/test_export.py FEFEFEFE.. [ 35%]
tests/test_agenda.py ...FEFEFEFEFE.FEFEFE.FE.FEFE. [ 38%]
tests/test_cancellation.py FEFEFEFEFEFEFEFFEFEFEFE [ 40%]
tests/test_dependencies_advanced.py FEFEFEFEFEFEFEFEFEFEFEFE.. [ 42%]
tests/test_dependencies_core.py FEFEFEFEFEFEFEFEFEFEFE [ 43%]
tests/test_dependency_uniqueness.py .... [ 44%]
tests/test_docket_clear.py .FEFEFEFE.FEFEF..FF [ 46%]
tests/test_docket_execution.py FE..FEFEFEFEFE...FEFEFFEFEFE [ 48%]
tests/test_docket_keys.py ................................. [ 53%]
tests/test_docket_registration.py ..F.......FE. [ 55%]
tests/test_execution.py ............ [ 57%]
tests/test_execution_state.py FEFEFEFEFEFFEFEFE...FE [ 59%]
tests/test_fallback_task.py FEFEFEFEFEFEFEFE [ 60%]
tests/test_handler_semantics.py FEFEFEFEFE [ 60%]
tests/test_key_leak_protection.py FEFEFEFEFEFE [ 61%]
tests/test_memory_backend.py FFFFF [ 62%]
tests/test_perpetual_race.py FEFEFEFEFEFEFEFEFEFEFEFEFEFE [ 64%]
tests/test_perpetual_state.py FFEFFEFEFEFF [ 65%]
tests/test_progress_basics.py .E.E...E.EFEFEFEFE.E [ 67%]
tests/test_progress_pubsub.py .EFE.EFEFEFEFEFE [ 68%]
tests/test_redelivery.py FEFEFEFEFEFEFEFEFE. [ 69%]
tests/test_results_retrieval.py FEFEFEFEFE..FEFEFEFE.FE [ 71%]
tests/test_results_storage.py FEFEFEFEFEFEFEFE...sssssss [ 74%]
tests/test_strikelist.py ....................... [ 77%]
tests/test_striking.py ..........................FEFEFEFEFEFEFEFE.FE. [ 83%]
tests/test_testing.py FEFEFEFEFEFEFEFEFEFEFE.FEFEFEFEFE.FEFE.FEFEFEFEFE. [ 87%]
[ 87%]
tests/test_uuid7.py .............................. [ 91%]
tests/worker/test_bootstrap.py FEFEFE.FFFF [ 92%]
tests/worker/test_core.py FEFEFEFEFEFE.....FEFEFE [ 94%]
tests/worker/test_invariants.py FEFEFEFE.FEFE. [ 95%]
tests/worker/test_lifecycle.py FEFEFE......FE [ 97%]
tests/worker/test_scheduling.py FEFEFE.FEFEFEFEFFFEFE.
[ 99%]
tests/worker/test_ttl_zero.py FFFFF [100%]

==================================== ERRORS ====================================
________________ ERROR at teardown of test_rich_logging_format _________________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:runs:019ce307-0457-7247-8896-e79ba3e3ed20']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError ________________ ERROR at teardown of test_plain_logging_format ________________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-afd47a56-fa08-4028-8f40-230bd67ccc53:runs:019ce307-0653-73e4-999e-ae7fcc54b059']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______________ ERROR at teardown of test_basic_concurrency_limit _______________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-66d1acb5-4d25-4cdb-adc6-3aabdb10853c:runs:019ce307-07ca-7516-816d-f4cbd8dfc32b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _____________ ERROR at teardown of test_per_task_concurrency_limit _____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-775bdb55-084f-4696-8579-f98e4054247c:runs:019ce307-0937-74d7-8934-c800cbc26d9f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_concurrency_limit_single_argument __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1d6e0376-79ac-4210-b5a4-b95cc2174a3b:runs:019ce307-0aa9-7084-9b0d-f8cdd2795a56']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _______ ERROR at teardown of test_concurrency_limit_different_arguments ________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d72fc07c-7542-4547-a7c7-5aabd3e4c83f:runs:019ce307-0c25-722a-81a5-68746e310197']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __________ ERROR at teardown of test_concurrency_limit_max_concurrent __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-229e7a0f-d4c3-414b-90b8-b63661529670:runs:019ce307-0d93-77c5-a251-dd45950ddc42']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_concurrency_limit_missing_argument_error ______ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a9523647-a503-4cf2-b4c7-1b21b35c024a:runs:019ce307-0f03-7041-bd41-f066e71c614e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_concurrency_limit_with_custom_scope _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3508ca96-a5f3-41ad-8b35-06ef6e5bf538:runs:019ce307-1077-7437-8348-28e207e1409d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_concurrency_limit_without_concurrency_dependency __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f7366860-2132-457d-856b-037121df2d15:runs:019ce307-1287-7361-a17d-ba63e0aad5bc']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_concurrency_keys_are_handled ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8b58e753-afca-4235-a20f-4a6aba925f87:runs:019ce307-15cb-77b4-b865-7b7f5f4ca515']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_worker_concurrency_with_task_failures ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5c478525-07c7-4dff-8b2b-37a5af2160aa:runs:019ce307-1732-707e-875d-3216b5797400']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_concurrency_error_handling_during_execution _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a9af7796-72ed-411e-bdb2-4bc3b9c027cf:runs:019ce307-189f-7213-81cd-49cf5341af86']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_worker_concurrency_multiple_workers_coordination __
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-127dcd98-5975-4a82-80a5-c6086727e200:runs:019ce307-1a0d-7466-b891-26b502cf10d0']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_worker_concurrency_refresh_handles_redis_errors ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7b0e8dc4-5acb-4d11-b891-3d90bfae4c90:runs:019ce307-1b8a-7250-b95e-728014f54973']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_worker_concurrency_robustness_under_stress _____
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c010fd2e-10c2-4350-8f0b-e1dbd90b859d:runs:019ce307-1cf0-7027-8c75-5a3a714669d5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_worker_concurrency_edge_cases ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ecc0b3f0-5850-44f4-bb86-33fe5bc79763:runs:019ce307-1e55-7786-bf45-660d4f1b6bce']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_graceful_shutdown_with_concurrency_management _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d5760bba-9a3a-4307-bbaf-63b151004ad6:runs:019ce307-1fbd-75c5-99f3-69ae863b10b4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_worker_concurrency_limits_task_queuing_behavior ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-776139d2-862b-431e-ab50-54c782b6e012:runs:019ce307-212c-76d7-8539-e21393b8616a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_worker_concurrency_different_customer_branches ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5f80bbf3-00a4-4327-8832-5648f86e29bc:runs:019ce307-22a0-716d-8fb1-570cd61fc40e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError _____ ERROR at teardown of test_worker_concurrency_limits_different_scopes _____ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f14636e3-573d-4a08-9e5f-f3b49c3dbde1:runs:019ce307-2405-7322-af4e-f488a7b34e24']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_worker_concurrency_refresh_mechanism_integration __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fe32c93b-817b-4efc-b945-a296f757f224:runs:019ce307-2571-7027-9965-9c094978d9ec']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_worker_concurrency_with_quick_tasks _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1767f822-bcd6-488c-8ada-bf1446c9b3bb:runs:019ce307-26e1-7688-9bfd-9e38632543f4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_worker_concurrency_with_dependencies_integration __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8dcfc9f6-71bf-4d61-a517-9b95adbc2e69:runs:019ce307-2864-71b7-9726-e1bfb112d462']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _ ERROR at teardown of test_concurrency_limited_task_successfully_acquires_slot _ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-df598554-5a0e-4827-b99d-ec17cebacba6:runs:019ce307-29cd-7064-8e57-b1d5a8a2ce5c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_task_timeout_with_explicit_timeout _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-48b4a4d0-4c6c-4017-8941-cbcba453864b:runs:019ce307-2b34-7337-8e76-baf5a186704d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_task_timeout_with_concurrent_tasks _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c0995d81-ba8b-4b18-9441-e07dac7c4f8d:runs:019ce307-2c9d-74cc-bf3b-98c8a73961e1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_explicit_timeout_limits_long_tasks _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-02c1562f-1bee-4d33-84ce-e630b29f4bc3:runs:019ce307-2e0e-77dd-8ce1-6fb691478b18']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_short_tasks_complete_within_timeout _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-06286318-97a9-4b27-9050-fb98878fb7a5:runs:019ce307-2f94-73b9-af67-19115ffd7458']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_redeliveries_respect_concurrency_limits _______ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fb37db03-1114-49ec-8f2b-8ac5faf9810b:runs:019ce307-310b-7105-b7e4-2cd21b767831']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_concurrency_blocked_task_executes_exactly_once ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9101c98b-df26-4f25-8fb2-e903219c064d:runs:019ce307-3275-7304-9806-42c27ad75a46']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_worker_concurrency_missing_argument_fails_task ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-272d0f0b-7a83-4032-bab1-c690a3617ca0:runs:019ce307-33df-7161-9d31-c87224d9b90f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_worker_concurrency_no_limit_early_return ______
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6699de3e-890b-48b7-be29-459e71ba04c5:runs:019ce307-354f-77ce-8bc2-30cf7a41e663']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_concurrency_missing_argument_shows_available_args _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c390901e-f6df-4383-8810-f3e236e3a58a:runs:019ce307-36ce-76cc-93ce-cab35d616a2d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_worker_concurrency_cleanup_on_success ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-dd127e1e-c384-4a14-92e7-31c1b2df4784:runs:019ce307-3834-71a2-b2dd-181d62910145']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_worker_concurrency_cleanup_on_failure ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9f23a6b3-b820-4b09-8980-18ec4bacf3bd:runs:019ce307-399e-7272-b560-ac22574a6352']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_worker_concurrency_cleanup_after_task_completion __
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fc3fda96-96a7-4c13-b610-b91ad349231f:runs:019ce307-3b0d-724c-b646-527f59f594f4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_handles_concurrent_task_cleanup_gracefully __
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-06d6388b-dd09-49c5-9756-7f334e13a8a4:runs:019ce307-3c7d-74c3-80af-ac0e9e8dca42']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_finally_block_releases_concurrency_on_success ____
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7b27217e-ccfb-4f95-b32e-f14215c4b5d7:runs:019ce307-3e03-762f-9692-4d387a24ffde']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_stale_concurrency_slots_are_scavenged_when_full ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8e7287eb-5245-4d9d-884d-96b1bae391f4:concurrency:customer_id:123', 'test-docket-8e7287eb-5245-4d9d-884d-96b1bae391f4:runs:019ce307-3f6d-77ff-b585-56ae80a3398e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_graceful_shutdown_releases_concurrency_slots ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-cc11799b-5692-471a-a053-3d3036c1e424:runs:019ce307-40d7-7064-84a6-0d8cf2107cee']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_simple_function_dependencies ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e5ec5986-995a-4d74-aa68-e83b34355dfa:runs:019ce307-4243-746d-875d-a4bf7a632179']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_contextual_dependencies _______________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-193f88d0-f452-41ad-afa4-a0ec7aa1b520:runs:019ce307-43b7-7376-a35d-5e753dbf852a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_dependencies_of_dependencies ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4b8ee664-54e3-4511-b484-ef2f8d0ae7b2:runs:019ce307-453f-77fe-9281-7f740c9c3448']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_dependencies_can_ask_for_docket_dependencies ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-766f66a2-b8aa-462a-a3dd-72537f46f789:runs:019ce307-46ad-7667-9746-d03cfbad70d4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_dependency_failures_are_task_failures ________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-de5d65be-11da-418c-b1b7-f828921c2ce4:runs:019ce307-481b-71be-8f02-527af1a129be']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_contextual_dependency_before_failures_are_task_failures _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-213034d1-c8af-4fa3-9926-1d4ab2732954:runs:019ce307-4989-71cc-896d-b4ecec5127c4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_contextual_dependency_after_failures_are_task_failures _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d0296c86-780d-492c-8970-f908b158d767:runs:019ce307-4aff-701c-a1d9-208558be8962']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_dependencies_can_ask_for_task_arguments _______
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-afdfc0be-805a-4b31-8041-b26223c35143:runs:019ce307-4c85-77a4-a2f4-9680c5c21d01']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_task_arguments_may_be_optional ___________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c9d8f0c8-24d4-4763-be1b-be1be2908dab:runs:019ce307-4dfb-74a3-a0d6-7fa43ffe7062']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_all_dockets_have_a_trace_task ____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6aee0070-78ba-4c1b-b43c-372e8b0d3fa5:runs:019ce307-4f77-7599-befb-2cad29e6e0cb']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_all_dockets_have_a_fail_task ____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c89213d2-73a4-4b92-a862-3f4f844b8c33:runs:019ce307-50f2-73cb-a604-b548bba3e8b8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_cancelling_future_task _______________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-728cbc86-7027-4053-bfd5-30b1dd2b8d00:runs:019ce307-526f-7363-8d69-610a8da68f11']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_cancelling_immediate_task ______________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0df3f3a9-f6c0-4885-892e-b3c0537e9ac7:runs:019ce307-53f6-7041-a9e3-6041220ee8c6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_cancellation_is_idempotent _____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0dd2b069-6818-4b7e-a502-197c4e59754b:runs:test-task:52d846f2-d8fe-4891-9e5b-84a21a5943bb']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_supports_requesting_current_docket _________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-73bf1330-a69f-4ec9-880d-6b229065f856:runs:019ce307-56e5-74dd-8020-56373f6d1437']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_supports_requesting_current_worker _________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-80658510-fe64-4605-8aa7-b48ca1f02d36:runs:019ce307-5869-7589-8192-1b313555bafe']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_supports_requesting_current_execution ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bbf01efe-6628-40e5-8d7b-ebe32d4d6286:runs:my-cool-task:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_supports_requesting_current_task_key ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6a42c383-854b-463d-ae3a-b4a892d06dbd:runs:my-cool-task:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_cron_task_reschedules_itself ____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
               ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c1983869-8d49-4b62-b555-f0d829d0e8b3:runs:019ce307-5cee-71b1-b3a3-ea18efcea0db']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_cron_tasks_are_automatically_scheduled _______
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2e6087a8-8a82-4c92-9230-ee4e429b803c:runs:my_automatic_cron']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* my_automatic_cron(...)
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * my_automatic_cron(...)
__________ ERROR at teardown of test_cron_tasks_continue_after_errors __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fcf40dbf-c24f-4ce1-9965-5575d4efb2fb:runs:019ce307-5f6b-71b3-8a14-20361f482cbe']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_cron_tasks_can_cancel_themselves __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b9ed51a2-f807-481b-a652-df9910ddc644:runs:019ce307-60eb-76e2-8f66-433934c88e74']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_cron_supports_vixie_keywords ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f87a639e-3f2c-4a5f-ad42-6c4f7ffd92fd:runs:019ce307-626c-7260-8d43-3bd6062e8103']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_automatic_cron_waits_for_scheduled_time _______
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-778ee7d7-6366-49e4-a3ae-f10b55a64ed0:runs:scheduled_task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* scheduled_task(...)
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * scheduled_task(...)
_________________ ERROR at teardown of test_cron_with_timezone _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2499d150-c996-417c-860f-1dc8c8459c6e:runs:019ce307-64d2-748e-b114-63a84aff6304']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_adding_task_with_unbindable_arguments ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-17cf5e2a-db72-4658-98b3-34495c74b7e6:runs:019ce307-66ef-7297-816b-fe0346d8613b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_adding_is_idempotent ________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b261c84c-0440-4fb2-a54f-147b37410e90:runs:my-cool-task:4a5912ce-4baa-4a75-b2de-e4b0dc711ca1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_task_keys_are_idempotent_in_the_future _______
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-03a9c6d1-0efd-4d70-b793-1282de6fa852:runs:my-cool-task:64727be6-3c04-4c1f-9a02-28e9b210d247']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_task_keys_are_idempotent_between_the_future_and_present _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e7369666-78bb-4a42-825c-654f7bb43b4e:runs:my-cool-task:f330a6bd-b5c4-4be3-a343-3ac47aa048b2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_task_keys_are_idempotent_in_the_present _______
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-25a8768e-0ce1-4d30-b2fa-bcf45dae8860:runs:my-cool-task:f369476e-6521-4f35-a9c2-c084ab5bef9a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_tasks_can_opt_into_argument_logging _________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-747372f8-500b-458a-9470-8efe31d3d524:runs:019ce307-6e88-71a3-8138-370807b03f54']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_tasks_can_opt_into_logging_collection_lengths ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b81b6996-328b-47ae-8552-0fb6cf9b4dc3:runs:019ce307-7009-7244-888c-e6e323b23f20']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_logging_inside_of_task _______________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3dbb8dcf-4fc0-43d3-9e13-eeaac6c6a59e:runs:my-cool-task:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________________ ERROR at teardown of test_perpetual_tasks ___________________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e32a1595-dd2f-431b-86ef-a3dc56de5bbc:runs:019ce307-731c-7782-8498-2812d4da2974']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_perpetual_tasks_can_cancel_themselves ________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a050994a-189e-4721-ae23-cbfbf19d8633:runs:019ce307-749d-7021-8656-7ebd67ca5984']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_perpetual_tasks_can_change_their_parameters _____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1b85f91b-c98a-4bfb-bfde-4a67c7188df2:runs:019ce307-7620-7513-ad4a-8a94fa16b8c7']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_perpetual_tasks_perpetuate_even_after_errors ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a29ebfbf-92dc-42c8-9073-3b6d66427329:runs:019ce307-77c0-7584-a008-9265fa6efda3']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_perpetual_tasks_can_be_automatically_scheduled ___
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8c0a39ed-44f3-4693-ba66-276b609c4485:runs:my_automatic_task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) INFO:docket.worker:* my_automatic_task(...) ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) INFO docket.worker:worker.py:1007 * my_automatic_task(...) _ ERROR at teardown of test_perpetual_tasks_can_schedule_next_run_after_delay __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." 
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e9a551e5-2f2f-400d-95dd-7102c6af2107:runs:019ce307-7a35-74a4-950f-0b4aaba5aa7c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_cancelled_automatic_perpetual_can_be_rescheduled __
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9b258d62-b8a7-4352-8569-38d77bcf83e8:runs:my_auto_task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_perpetual_tasks_can_schedule_next_run_at_specific_time _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-12114f41-dc4d-4616-8fc0-b2bf17c3f685:runs:019ce307-7d46-7786-9367-7c7d2d896dad']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_tasks_can_report_progress ______________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-94de9e9c-1355-4e79-b795-250053162b63:runs:progress-task:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_tasks_can_access_execution_state __________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ed214687-8df1-4648-be27-a0ebb33e9940:runs:stateful-task:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_execution_state_lifecycle ______________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bcd1eed9-4298-425f-9ffa-7068dfc07e33:runs:success:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_task_results_can_be_stored_and_retrieved ______
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-65b3264c-5676-4141-84a3-a42d4479de74:runs:019ce307-83a7-76f6-8c26-24468190bb41']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________________ ERROR at teardown of test_errors_are_logged __________________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4c26b422-2a15-46e2-8487-907833622635:runs:019ce307-8540-7547-b95d-718375178d08']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_supports_simple_linear_retries ___________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6520b344-f6b0-4af6-9111-5e67fb676f3e:runs:019ce307-86e2-7471-acec-b72e2d5531af']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_supports_simple_linear_retries_with_delay ______
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c721b7da-44b7-4a00-a0ae-b9af826c7d62:runs:019ce307-8886-734f-b01c-12311acdbe11']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_supports_infinite_retries ______________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-cfde6278-497f-4fc1-9c3d-af8a8683571f:runs:019ce307-8a30-73d9-894f-663ccbf3abc1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_supports_exponential_backoff_retries ________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0dc3b84e-2bf6-44df-99ac-7e895f8a4264:runs:019ce307-8bca-7502-bed0-2dd71bc49ce3']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_supports_exponential_backoff_retries_under_maximum_delay _

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-242191a6-611b-4349-9cfd-7a942ff871f6:runs:019ce307-8d66-7495-842e-4341b849b397']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_immediate_task_execution ______________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0e2a943d-8586-43f1-9048-512169f81acd:runs:019ce307-8f20-7091-94a8-2f653ab74dfe']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_immediate_task_execution_by_name __________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ddb319c7-3586-45c2-a76c-831dd19da734:runs:019ce307-90c1-7529-832c-902700af4a21']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_scheduled_execution _________________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7902b22e-3d06-4c27-90ed-e12d2d58a697:runs:019ce307-925e-7641-9068-7423114a9f9c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________________ ERROR at teardown of test_rescheduling_later _________________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c56d2a6b-3ee2-4b4c-8583-293d58914bf4:runs:my-cool-task:f2915d77-c2e5-45ee-8150-c40220875a8d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_rescheduling_earlier ________________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b4553534-f7cb-49a1-8322-980c00e6f2b0:runs:my-cool-task:653e7a72-45f1-4739-af16-7f64debb7e78']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_rescheduling_by_name ________________

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e2a15e51-7c32-4735-93b3-9af62ee7fe28:runs:my-cool-task:44cb1c94-bfe9-4faa-a436-e3d545b3caad']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_replace_without_existing_task_acts_like_add _____

[teardown traceback and verify_remaining_keys_have_ttl listing identical to the first error above]

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7476ecb8-3b0d-447d-879b-fd8852aa7efb:runs:my-cool-task:6e433c77-ed48-44d9-9774-c027d4a6840d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_self_perpetuating_immediate_tasks __________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f7df5e8c-9158-4419-a7a5-40e0b9d0b1eb:runs:first']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_self_perpetuating_scheduled_tasks __________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0520ba25-deca-46a0-9b32-c70eb9f70f20:runs:first']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_infinitely_self_perpetuating_tasks _________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7fda497d-addb-4b6c-94de-23853cff0812:runs:first']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_shared_dependency_is_initialized_once ________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9af13210-327f-46c7-81a4-ea388961cde3:runs:019ce307-9f8c-70b4-807b-c89437aab13f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_shared_dependencies_are_same_instance ________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6ab59123-757b-4651-b1c5-51a8e9dc3b8d:runs:019ce307-a137-76d2-b967-686a1755afc7']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_shared_identity_is_factory_function _________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9d72d9c6-cc36-479c-b995-72cf2fd041cf:runs:019ce307-a2ee-76bc-8284-658fce578693']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_shared_cleanup_on_worker_exit ____________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c23f8f0b-0f81-48bb-adbc-597d084e3bd5:runs:019ce307-a492-74c1-9478-f981a7fcedc2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_shared_depending_on_shared _____________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-63d5921d-b2fc-417a-89ed-57799ddf4c54:runs:019ce307-a635-7291-a4d9-654f648b3658']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_shared_depending_on_depends _____________

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d0f76a77-b2a2-4148-b4ea-7475f9ef1e13:runs:019ce307-a7e8-70c4-825b-2223bee0af29']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_shared_can_access_current_docket_and_worker _____

    [traceback identical to the first teardown error above]

E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5f7872d1-f3ad-4ebe-b3c1-b1938d95c448:runs:019ce307-a9c1-714d-a860-16622b0e422f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_late_registered_task_with_new_shared ________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5f206aca-8ecb-42dd-8cb0-86864c50d58e:runs:019ce307-ab8d-76e0-bc1d-236fb6c38b79']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_multiple_shared_cleanup_order ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-080de3f1-d6ca-47a3-9dbf-58eb31ba4f9f:runs:019ce307-ad47-76bd-b446-89c1cc3f6702']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_shared_cleanup_on_init_failure ___________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0e40a151-0473-4442-9dc5-9c817dcb2e5b:runs:019ce307-aefc-71a6-a1df-ccb1bd99c82d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_shared_async_function_factory ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8ee21f8d-46f4-49c9-9a5c-51c4bf8c7b40:runs:019ce307-b0a6-7119-ad22-0dc54c6503da']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_shared_sync_function_factory ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-77c04992-b227-4400-ac0b-986c852b5ab4:runs:019ce307-b26c-72e6-a437-51baa51554b8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_shared_sync_context_manager_factory _________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-82da3cbf-155f-49ff-8b6e-c238b91f5ee8:runs:019ce307-b411-7320-b1d0-9f647c5b54b6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_striking_entire_tasks ________________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7b4055ec-d950-4a45-bf06-de7ecdb4d981:runs:019ce307-b5d4-74ce-ba8a-839dace6b27c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_sync_function_dependencies _____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-115fc633-a1c4-4cca-989a-4b33447ef6fa:runs:019ce307-ba9d-757b-95d1-32064dffc024']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_sync_contextual_dependencies ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1f0307a7-3163-4570-9f72-1bb8b62ca587:runs:019ce307-bcad-7502-b70a-6b3925c6d73a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_mixed_sync_and_async_dependencies __________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a5d3baa3-e5bc-4538-9225-5b8c8e951bfc:runs:019ce307-be86-75b4-bb9e-9429ac99767a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_sync_dependencies_of_dependencies __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-40210b04-0b82-4c94-8527-4e616dca2f29:runs:019ce307-c06a-75b8-a211-e8c936bd27b9']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_sync_dependencies_can_ask_for_docket_dependencies __
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4f969c5c-0a91-47a5-9366-6eeedcf2e98c:runs:019ce307-c26d-74e3-bc9e-3b1968fb0348']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_mixed_sync_async_nested_dependencies ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6599a151-a841-4bf5-b90e-c00f0111b820:runs:019ce307-c466-76c8-9fa5-1a331f788b30']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________________ ERROR at teardown of test_simple_timeout ___________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-193d0691-bd78-4619-94cd-f246e1469a99:runs:019ce307-c654-7261-9e34-6763fcacf957']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_simple_timeout_cancels_tasks ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6039a6c7-451a-43ed-9a39-97c69534c6cf:runs:019ce307-c810-7047-b1a6-26b19110ea7d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_timeout_can_be_extended _______________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-21074d93-5693-4645-88dc-a9be38bfae9f:runs:019ce307-c9dd-7706-8d50-8a5661155f25']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_timeout_extends_by_base_by_default _________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fdf6ccb9-3d1b-4e99-9b2c-ca0d7621296f:runs:019ce307-cb9a-7615-aa51-05da53b8adf4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_timeout_is_compatible_with_retry __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-df10dc40-239c-4bf9-8b65-b57519b4cff6:runs:019ce307-cd64-744e-8507-23260a09e4a6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_adding_a_task_increments_counter __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e09e1595-eb86-45a0-95b4-a3e7cca1d82f:runs:019ce307-cf2d-74cb-b851-0af4bf64af8c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_replacing_a_task_increments_counter _________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b27bea9d-a2b6-438d-9282-70a33ed12ea9:runs:test-replace-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_cancelling_a_task_increments_counter ________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-599320e8-b64f-4dc2-aef9-2271a224b4c2:runs:test-cancel-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_worker_execution_increments_task_counters ______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e8d432ac-3c23-48f5-8bc4-3b835caea059:runs:019ce307-d4e0-70d1-b88f-86b5261fad2e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_failed_task_increments_failure_counter _______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-854eaabe-6f34-4bf9-b650-26b1517a8c97:runs:019ce307-d6c9-73cc-b3a2-56afa642eb02']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_retried_task_increments_retry_counter ________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d3f1bd51-a402-4680-9a8c-e410fd63b98e:runs:019ce307-d8af-74f8-9ba9-dc3cd57bcfc8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_exhausted_retried_task_increments_retry_counter ___

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-07696aee-a252-492e-b181-957e3e47de9a:runs:019ce307-dac1-7139-b10d-f45b2e3e95cb']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_retried_task_metric_uses_bounded_labels _______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-edc4d4cf-abb5-4624-aa69-a94569b92a81:runs:019ce307-dce0-7556-bb4d-e609a10eca2b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_perpetuated_task_metric_uses_bounded_labels _____

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d68608fa-4137-49c5-bec6-aa92318d12b9:runs:019ce307-debe-726e-8e73-5352c20b6695']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_redelivered_tasks_increment_redelivered_counter ___

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7e2e4eca-8523-49d6-b4ae-ca26914472bb:runs:019ce307-e096-706d-9574-11751935bb7a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_superseded_task_increments_superseded_counter ____

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-08416308-c673-4b41-a458-f65e97651944:runs:metrics-superseded']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_replaced_task_only_counts_replacement ________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5ba11c61-a506-45a2-8016-c4d7ffbd7f08:runs:metrics-replace']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_task_duration_is_measured ______________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a36328c1-8cb9-4fb7-89c9-0c3db1d6b118:runs:019ce307-e666-72ea-81cd-0fd60aef4d83']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_task_punctuality_is_measured ____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-aa023f10-5028-4667-9bc3-6b18d850ffff:runs:019ce307-e841-744b-a992-a4f3b6d7416b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_task_running_gauge_is_incremented __________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-80f9868f-4b71-4b04-9138-1f5730affea3:runs:019ce307-ea18-745a-9ced-b7d7140630c5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_worker_publishes_depth_gauges ____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-48538f15-cfe4-4f2d-b3b0-f885d02e90b2:runs:019ce307-ebf0-73b4-9041-e4adfb47a0d1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_agenda_scatter_basic ________________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fd9ab224-be74-4f9e-b319-273a18d87356:runs:019ce307-f462-77df-9b91-9dde27c11fa1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_agenda_scatter_with_start_time ___________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-609f7932-9fe6-43ce-ab73-b15d362400f4:runs:019ce307-f673-70b4-9081-28f8e2c7e4f4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_agenda_scatter_with_jitter _____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-232c0cad-e457-432d-870a-70fbace641a4:runs:019ce307-f863-7757-9dc9-a83c876306c8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_agenda_scatter_with_large_jitter __________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-75ac8f53-b090-45c1-a366-04483f1ee151:runs:019ce307-fa46-73f7-82de-027cf536eff5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_agenda_scatter_single_task _____________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-95e80903-9bb6-42c1-a714-428b75a17f60:runs:019ce307-fc34-75a4-97b1-fb8472018aac']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_agenda_scatter_heterogeneous_tasks _________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
                ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fc186455-1644-414d-9677-fb135959b38b:runs:019ce307-ff7d-72d7-889b-04a78a0b7c66']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_agenda_scatter_preserves_order ___________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c52eda57-4281-405d-b803-0ea54ba493b6:runs:019ce308-01a8-71aa-9075-52315f35b0fc']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________________ ERROR at teardown of test_agenda_reusability _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f6938317-5d8b-49bb-8443-59e2b9b007f6:runs:019ce308-03cb-72e5-977f-eeeb04c4d695']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_agenda_scatter_with_task_by_name __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-30a72178-4930-4fba-9745-db4a62ce2ae3:runs:019ce308-0689-730b-b37f-73138cee1157']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_agenda_scatter_partial_scheduling_behavior _____
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0ba06c0a-12c5-4eeb-bcaf-024d5cc98dba:runs:019ce308-094c-764c-a0f3-1cd44d4fb694']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_agenda_scatter_auto_registers_unregistered_functions _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4285f961-f27e-498a-b422-30f9de550ca4:runs:019ce308-0b48-73b3-aa40-586a2ed344e6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_cancel_running_task _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fbb36432-af8f-4e75-a42c-1d5977a2121f:runs:019ce308-0e59-70e6-9ba5-ee8a3696956b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_cancel_running_task_state ______________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-756a02e6-2d9f-46c3-90b5-f3a8eb6c8867:runs:019ce308-1055-70b8-9b09-bcb6445c431e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_cancel_running_task_with_cleanup __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b021ad83-cf17-408d-935d-7ed46613ac0d:runs:019ce308-1266-730b-850a-ea6db60a0ec2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_cancel_task_that_ignores_cancellation ________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-aa4200e0-d280-4c44-b3be-6baee4fd03da:runs:019ce308-1458-73d0-955c-daa10d3b914f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_cancel_already_completed_is_noop __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b30a6ccc-29e8-421d-a0c3-0f8b2f84c013:runs:019ce308-1659-7488-8546-83f13d2d86fb']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_cancel_publishes_state_event ____________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d411a44a-4481-42d2-81b8-889ad046d0d0:runs:019ce308-185f-7655-a64e-8619c154d615']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_cancel_only_affects_running_worker _________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8e04c96d-bbb6-42dd-8405-6b6bb04efc6d:runs:019ce308-1aa8-772f-b666-c032f2e8eb49']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_cancelled_task_with_retry_does_not_retry ______

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0a1a944e-b5b0-4869-ab6c-653cc3109b40:runs:019ce308-1e59-7713-9356-b9e53c693ee6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_cancelled_perpetual_task_does_not_perpetuate ____

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c5ae083d-cd42-4d76-a975-968228a6f3f5:runs:019ce308-2058-75e3-9107-467767b43358']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_cancel_running_task_with_timeout __________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-898f6526-36f8-4a8d-897f-29e7cf7f6f5f:runs:019ce308-2258-76ee-b35c-b523dfdff02c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_get_result_raises_execution_cancelled_for_cancelled_task _

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-93c85fab-0bd3-4b97-a17f-bdb77bff8e38:runs:019ce308-246b-740a-82e8-529ce48bbd34']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_sync_function_dependency ______________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-301fe1c6-c0f2-402f-bdc7-3712891cba8d:runs:019ce308-2699-7153-9759-d4ac1d37a0e9']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_sync_context_manager_dependency ___________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6f8d3624-5a3a-4118-9c9e-91ece7175f35:runs:019ce308-289d-703d-b378-e03f5aa1a89a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_mixed_sync_and_async_dependencies __________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c554d747-a993-4580-b742-cac6698f1e6f:runs:019ce308-2aa5-76cf-979c-b67b567d7a21']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_nested_sync_dependencies ______________

    (traceback identical to the first teardown error above)

E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4a838b1b-ae4c-4934-8c5b-dbb87c9b44e1:runs:019ce308-2cba-7191-8e93-aca896388ee5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_sync_dependency_with_docket_context _________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4dbdf753-65e6-4ca9-bb2a-8f1432489160:runs:019ce308-2f0e-726f-a868-a994a37c6763']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_sync_context_manager_cleanup_on_exception ______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-db39f099-d67e-4a42-bbd9-326afae8ad61:runs:019ce308-311f-7547-b695-a8fcec8920bd']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_sync_dependency_caching _______________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-58167665-9dc0-4508-af0c-83e573e766e7:runs:019ce308-332b-711c-a185-30791d840889']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_mixed_nested_dependencies ______________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b0e9857d-afd8-4996-9258-abc2346de8ff:runs:019ce308-3541-715f-a79d-369357f9f53d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_contextvar_isolation_between_tasks _________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3e77e7d6-b0e7-4e5b-b757-f507471d3541:runs:019ce308-3761-7276-8074-ce9f118b3a87']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_contextvar_cleanup_after_task ____________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f55295d2-27ba-410d-9dc9-f6425595444f:runs:019ce308-3985-74af-8f7e-9d8b4960c0cc']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_dependency_cache_isolated_between_tasks _______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a1ef9e25-44d4-400f-b538-d006f5e6f5a4:runs:019ce308-3bcb-707c-89df-ac8abe224336']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_async_exit_stack_cleanup ______________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fc117488-5e38-412a-8cec-c2a8c1314dd0:runs:019ce308-3de4-7556-a018-4f6945cc6002']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_dependencies_may_be_duplicated ___________

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a63a04e6-21aa-4404-a21a-7486f458e808:runs:019ce308-4232-7751-a9ec-979aa0d139af']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_users_can_provide_dependencies_directly _______

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e9a9b417-6a7e-49cc-aa46-ebae8edd8630:runs:019ce308-4450-7174-92cd-51cac4c57b7d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_user_provide_retries_are_used ____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-85f45435-bb69-40bb-96de-2165d1ef15eb:runs:019ce308-46b0-73da-8879-84a34f8937d0']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_user_can_request_a_retry_after_a_delay[Retry] ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-321b7639-7b72-4132-ac21-9bae026eb415:runs:019ce308-48ce-71e2-8527-40408f264d15']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_user_can_request_a_retry_after_a_delay[ExponentialRetry] _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5156c3fb-a79d-49a6-8bca-b21fbe7ddb56:runs:019ce308-4aee-705b-99a4-f93be1d20576']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_retry_in_is_backwards_compatible_alias_for_after __
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9981a30a-93e2-4cca-8405-cc026bcab851:runs:019ce308-4d11-76bd-b5d3-784b19b1274e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_user_can_request_a_retry_at_a_specific_time[Retry] _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2a91fb49-240e-4f6b-8695-05d18f6a901f:runs:019ce308-4f34-73ed-87e1-d8127bdcc2c2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_user_can_request_a_retry_at_a_specific_time[ExponentialRetry] _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e14bc529-bb58-4d85-8b37-4b6115c40674:runs:019ce308-5187-7137-bb52-0ff4af0e9f56']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_user_can_request_a_retry_at_a_specific_time_in_the_past _
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e85579a6-1f47-4d87-8923-119850429c5c:runs:019ce308-53a5-7003-8a9d-ab3fb39bba8e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_dependencies_error_for_missing_task_argument ____
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2274e71a-43ab-4c7f-a03e-6eceef737255:runs:019ce308-55c5-7331-8303-8e4e958fda76']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_a_task_argument_cannot_ask_for_itself ________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4bc7f1a1-fb57-4a67-b14c-84c73f16f914:runs:019ce308-57e9-762a-baac-447cc00abe32']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_clear_with_immediate_tasks _____________
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-99675345-c6b4-480a-82ae-e2ec7b3f1593:runs:019ce308-600b-7111-b327-c58342f0a38d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_clear_with_scheduled_tasks _____________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4f7faa76-5f52-47b3-a2ae-955a03c8a115:runs:019ce308-626e-779c-b509-997118d5dad8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_clear_with_mixed_tasks _______________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-75207d27-064e-4b7d-aabe-e7eb905e3db1:runs:019ce308-6499-7124-ab3e-d9c2c2f201ca']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_clear_with_parked_tasks _______________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2f437912-3e83-4a07-9cb0-b1fb6f0a9eff:runs:task1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_clear_returns_total_count ______________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-870ff481-f29d-4253-aae7-95140e3874fb:runs:019ce308-6a26-7226-8325-571032b0780b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______________ ERROR at teardown of test_clear_no_redis_key_leaks ______________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0108b231-11fa-4c60-8ff3-88144a0c3202:runs:019ce308-6c66-718e-b48e-298c54ccb554']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_docket_schedule_method_with_immediate_task _____
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7a1b4586-7a4c-4118-83f9-cd170e8105cb:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_get_execution_for_scheduled_task __________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5db165f0-aaa6-411d-b234-67c140f6c4b5:runs:scheduled-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_get_execution_for_queued_task ____________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2f08e22f-c4e9-4f4d-897f-855b43a86c2e:runs:immediate-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_get_execution_function_not_registered ________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-22c5f764-eeeb-417a-8d75-59b9fea1760c:runs:task-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_get_execution_with_complex_args ___________
(teardown traceback identical to test_clear_with_immediate_tasks above: pytest_asyncio/plugin.py:289 -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl)
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fee763f3-b383-4a24-9601-73bfab5ddd38:runs:complex-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_get_execution_claim_check_pattern __________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-842d4de1-ebea-4179-8d2d-395206b9c318:runs:claim-check-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_cancelled_state_creates_tombstone __________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5c6cb054-19a5-4482-97d1-1e4db606c6ee:runs:task-to-cancel']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_cancelled_state_respects_ttl ____________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b3ccc40e-497a-442f-bcd1-9b7efa5219c3:runs:ttl-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_get_execution_after_cancel _____________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a08e0341-ba58-497d-b9e5-8a191d83d146:runs:cancelled-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_replace_does_not_set_cancelled_state ________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-61dd0e97-df3b-48f3-95ae-74aa6e727c65:runs:replace-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_cancellation_idempotent_with_tombstone _______
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5d9bc9ad-099f-465e-8ee2-8e260516c5f0:runs:idempotent-task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_schedule_task_by_alias _______________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bbafd8a5-e4bf-409f-b8bc-7d8a28698bd9:runs:019ce308-cea1-7308-83f7-29c130c65064']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_run_state_scheduled _________________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-606f0b47-da00-4c52-9985-4841c6c151a3:runs:019ce308-e307-7403-ac58-23c0c4678f0a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_run_state_pending_to_running ____________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2794c536-c834-4af6-b8d8-33b5665326ab:runs:019ce308-e575-7042-a8df-2a8dca0aee55']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_run_state_completed_on_success ___________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7eeaffbb-48b9-4df3-bf43-6389ddc77ea0:runs:019ce308-e827-70e3-b77b-66544368f076']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_run_state_failed_on_exception ____________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7431d88a-0a3f-4219-9bf2-5ab02766d49b:runs:019ce308-ea93-700a-83fa-ec8c4c853121']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError ___________ ERROR at teardown of test_run_state_ttl_after_completion ___________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1c944e54-b805-4adc-b28e-fca1249d69d1:runs:019ce308-ed03-7677-a6f2-2bf1e223de11']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _____________ ERROR at teardown of test_full_lifecycle_integration _____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-529a98cb-12b7-4a72-9714-f3ded891a372:runs:019ce308-f1a0-73dd-ac94-bb98dbfd447b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ____________ ERROR at teardown of test_run_add_returns_run_instance ____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f68966b1-960a-4564-abf8-828d275e2c5c:runs:019ce308-f422-71f2-b2f3-6b75ba49b26b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __________ ERROR at teardown of test_error_message_stored_on_failure ___________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f661e20a-cad2-4d70-a0f9-bd55bf821e4b:runs:019ce308-f6e4-709e-ab30-e627d4f47005']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_mark_as_failed_without_error_message ________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-456d512a-c61d-4e8e-80ce-0ca7cf9d0711:progress:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_default_fallback_task_logs_and_acks _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ec14e25c-0f71-4aa4-8918-e31c76af7a27:runs:019ce309-00c4-758d-96a2-453e7f9135ee']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ___ ERROR at teardown of test_custom_fallback_receives_original_args_kwargs ____ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7bfd1ed7-f147-4118-9c4a-c5b2e766a8cc:runs:019ce309-033f-70cb-9bf5-4bd83af97d23']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_fallback_can_access_function_name __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2d6f2934-22ba-44e9-82e4-630d3d20a49d:runs:019ce309-05b5-757c-ba86-777319140ee5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ___________ ERROR at teardown of test_fallback_dependency_injection ____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6a6f896d-31a5-4905-ad06-7a771d8ba1cd:runs:019ce309-083b-71dc-a4cd-8015ea89ced8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_fallback_custom_user_dependency ___________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0c583b57-8798-436f-824d-9a008ed1bc3d:runs:019ce309-0b05-7080-809f-b720009ff281']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_fallback_return_completes_task ___________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9bf31798-1baa-401d-8a27-c0a090b0972a:runs:019ce309-0d7f-7511-ad04-4b1953c382ab']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_fallback_exception_triggers_retry __________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b784b8ae-6821-466a-8635-07321898a091:runs:019ce309-0ffb-7377-8981-1a7fd1c60fe5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_execution_function_name_matches_for_known_tasks ___

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4483700e-97ec-49f6-aace-04b8760a7ec8:runs:019ce309-127d-7560-859d-ac6c17a79619']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_retrying_task_is_not_marked_as_failed ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e4bead4c-6dce-4ece-9816-4e5855550e51:runs:019ce309-14f5-76ac-840a-d25495e1e9a1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_exhausted_retries_marks_task_as_failed _______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-cc6f2a85-d784-4b25-aff7-024b6774559c:runs:019ce309-1773-750c-a35f-8b4d54a12ff4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_failed_perpetual_task_is_rescheduled ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d859cdb7-6186-445a-87f0-6f71fb9024aa:runs:019ce309-1a4f-74e5-b351-726601b25e64']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_retry_and_perpetual_work_together __________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-72c956f9-aaf8-490e-a830-4fc25d3d41f9:runs:019ce309-1cc3-75b5-9187-c73908429c58']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_perpetual_after_is_respected_on_failure _______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c7098ee0-fdac-41a3-b9db-2b861b90e339:runs:019ce309-1f4a-7020-8862-66c80da2e20a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_leak_detection_catches_keys_without_ttl _______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire.

        Keys without TTL are allowed only for tasks that are still scheduled/queued
        (not yet executed). Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>       assert not keys_without_ttl, (
                   ^^^^^^^^^^^^^^^^^^^^
            f"Memory leak detected: The following keys have no TTL "
            f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
            f"to prevent permanent memory usage. Keys without TTL are only allowed for "
            f"tasks that are still scheduled/queued (not yet executed)."
        )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5fda3c84-ed64-4417-be09-7b034969a5be:runs:019ce309-21f7-7390-a92d-affe4f523ff2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____________ ERROR at teardown of test_permanent_keys_are_exempt ______________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-03b6270a-136b-4218-b959-71104b8ee40c:runs:019ce309-24a5-706d-abed-fe647630ca62']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_exemption_mechanism _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3b0e579f-e6cf-4c94-9fe2-87c92b1db2fd:runs:019ce309-2744-7654-a9b4-9f330b57f35a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_multiple_exemptions _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e26a9af1-6e53-4b2f-a42c-f8d2708cdf1f:runs:019ce309-2a2a-767c-921d-fd0ed275e4ea']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_worker_task_sets_are_exempt _____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-572f0af3-3787-40bb-94e4-8a118bbb1039:runs:019ce309-2cbf-77ab-980c-20d32d72472d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________________ ERROR at teardown of test_queue_is_cleaned_up _________________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-069828cf-ac87-449f-afc8-bb4b374bbe8a:runs:019ce309-2f5d-75ed-8cb0-7e260d35a289']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=0] _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:runs:perpetual-race-test']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=60s] _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6b77ac02-67d5-4b82-896b-ec355b197373:runs:perpetual-race-test']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____ ERROR at teardown of test_is_superseded_after_replace[execution_ttl=0] ____
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-34e4d7e5-fe3f-4079-a5a7-a8a889598482:runs:gen-test']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_is_superseded_after_replace[execution_ttl=60s] ___
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-03bd87e5-2531-49e4-bbc9-b3bfffc6873f:runs:gen-test']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_superseded_message_skipped_before_execution[execution_ttl=0] _
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-65c73731-330d-4759-8855-0a22948b6d2a:runs:head-check']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_superseded_message_skipped_before_execution[execution_ttl=60s] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d1c00154-8dc2-4148-beb2-36e40ddcdf18:runs:head-check']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_old_message_without_generation_runs_normally[execution_ttl=0] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-076f77d8-3442-4910-8f76-a525572c4101:progress:old-to-new', 'test-docket-076f77d8-3442-4910-8f76-a525572c4101:runs:old-to-new']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* legacy_task()
DEBUG:docket.worker:Getting redeliveries
DEBUG:docket.worker:Getting new deliveries
DEBUG:docket.worker:Getting redeliveries
DEBUG:docket.worker:Getting new deliveries
DEBUG:docket.worker:Scheduling due tasks
INFO:docket.worker:↪ [ 10ms] legacy_task(){old-to-new}
ERROR:docket.worker:↩ [ 1ms] legacy_task(){old-to-new}
Traceback (most recent call last):
  File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    return await conn.retry.call_with_retry(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    return await do()
           ^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    raise response
redis.exceptions.NoScriptError: No matching script. Please use EVAL.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute
    await execution.mark_as_completed(result_key=result_key)
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed
    await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal
    await terminal_script(
    ...<8 lines>...
    )
  File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5578, in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    return await conn.retry.call_with_retry(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    return await do()
           ^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    raise response
redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
stack traceback:
        [string ""]:22: in main chunk
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * legacy_task()
DEBUG    docket.worker:worker.py:444 Getting redeliveries
DEBUG    docket.worker:worker.py:465 Getting new deliveries
DEBUG    docket.worker:worker.py:444 Getting redeliveries
DEBUG    docket.worker:worker.py:465 Getting new deliveries
DEBUG    docket.worker:worker.py:678 Scheduling due tasks
INFO     docket.worker:worker.py:828 ↪ [ 10ms] legacy_task(){old-to-new}
ERROR    docket.worker:worker.py:975 ↩ [ 1ms] legacy_task(){old-to-new}
    ...<NoScriptError / ResponseError traceback identical to the captured stderr above>...
_ ERROR at teardown of test_old_message_without_generation_runs_normally[execution_ttl=60s] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-724dd26c-8345-4e83-9ed0-1b65c391c13a:progress:old-to-new', 'test-docket-724dd26c-8345-4e83-9ed0-1b65c391c13a:runs:old-to-new']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* legacy_task()
DEBUG:docket.worker:Getting redeliveries
DEBUG:docket.worker:Getting new deliveries
DEBUG:docket.worker:Getting redeliveries
DEBUG:docket.worker:Getting new deliveries
DEBUG:docket.worker:Scheduling due tasks
INFO:docket.worker:↪ [ 9ms] legacy_task(){old-to-new}
ERROR:docket.worker:↩ [ 1ms] legacy_task(){old-to-new}
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute
    await execution.mark_as_completed(result_key=result_key)
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed
    await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal
    await terminal_script(
    ...<8 lines>...
    )
  File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    return await conn.retry.call_with_retry(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    return await do()
           ^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    raise response
redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
stack traceback:
        [string ""]:22: in main chunk
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * legacy_task()
DEBUG    docket.worker:worker.py:444 Getting redeliveries
DEBUG    docket.worker:worker.py:465 Getting new deliveries
DEBUG    docket.worker:worker.py:444 Getting redeliveries
DEBUG    docket.worker:worker.py:465 Getting new deliveries
DEBUG    docket.worker:worker.py:678 Scheduling due tasks
INFO     docket.worker:worker.py:828 ↪ [ 9ms] legacy_task(){old-to-new}
ERROR    docket.worker:worker.py:975 ↩ [ 1ms] legacy_task(){old-to-new}
    ...<ResponseError traceback identical to the captured stderr above>...
_ ERROR at teardown of test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=0] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1e35e812-2c00-430b-8139-e8e4701f3818:runs:new-old-new']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=60s] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9c3cb37d-763b-4172-8ed0-9f38b3b43b0d:runs:new-old-new']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_replace_skips_stale_stream_message[execution_ttl=0] _

    ...<same pytest_asyncio finalizer and verify_remaining_keys_have_ttl() teardown traceback as above>...

tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-840391af-2310-4e35-afb2-de4611f99e4e:runs:replace-race']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_replace_skips_stale_stream_message[execution_ttl=60s] _
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bd785855-b69b-4496-bf8c-edad7bfef148:runs:replace-race']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_perpetual_successor_survives_mark_as_terminal[execution_ttl=0] _
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-83efe645-6293-4e9b-a63a-6ddc26554daf:runs:successor-survives']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_perpetual_successor_survives_mark_as_terminal[execution_ttl=60s] _
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0727c09f-2b8c-4acf-80db-0b648ae52baa:runs:successor-survives']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_perpetual_task_state_isolation ___________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2cd43bed-d697-49f0-94cf-6be9bb470977:runs:019ce309-5164-7336-aa9d-5c3869538141']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_rapid_perpetual_tasks_no_conflicts _________
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ec82c347-3c9a-4704-8821-f05977264c4a:runs:019ce309-5428-76de-8b75-80bed378bd97']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_perpetual_same_key_no_state_accumulation ______
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5cdb69a0-f8fa-42d4-9ee0-5ef336e57360:runs:019ce309-55ae-756b-9111-4351c3ddfd08']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_perpetual_task_state_transitions_with_same_key ___
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-58efbd0c-298c-4426-b1d7-963971f7a899:runs:019ce309-5738-71c3-af4b-a086ca4ad9d4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________________ ERROR at teardown of test_progress_create ___________________

  + Exception Group Traceback (most recent call last):
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call
  |     result: TResult | None = func()
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in <lambda>
  |     lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
  |   File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__
  |     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec
  |     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall
  |     raise exception
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |   File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown
  |     yield
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |   File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown
  |     return (yield)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall
  |     res = hook_impl.function(*args)
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown
  |     item.session._setupstate.teardown_exact(nextitem)
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact
  |     raise exceptions[0]
  | ExceptionGroup: errors while tearing down (2 sub-exceptions)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker
    |     await checker.verify_remaining_keys_have_ttl()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl
    |     assert not keys_without_ttl, (
    | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4ebb942a-0822-4b9b-ab48-d347bcb2628a:progress:test-key', 'test-docket-4ebb942a-0822-4b9b-ab48-d347bcb2628a:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
  +---------------- 2 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_basics.py", line 23, in execution
    |     await execution.mark_as_completed()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed
    |     await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal
    |     await terminal_script(
    |     ...<8 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    |     return await client.evalsha(self.sha, len(keys), *args)
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    |     return await conn.retry.call_with_retry(
    |     ...<4 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    |     return await do()
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    |     return await self.parse_response(conn, command_name, **options)
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    |     response = await connection.read_response()
    |   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    |     raise response
    | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
    | stack traceback:
    |     [string ""]:22: in main chunk
  +------------------------------------
_________________ ERROR at teardown of test_progress_set_total _________________

  + Exception Group Traceback (most recent call last):
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call
  |     result: TResult | None = func()
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in <lambda>
  |     lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
  |   File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__
  |     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec
  |     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall
  |     raise exception
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |   File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown
  |     yield
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |   File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown
  |     return (yield)
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall
  |     res = hook_impl.function(*args)
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown
  |     item.session._setupstate.teardown_exact(nextitem)
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact
  |     raise exceptions[0]
  | ExceptionGroup: errors while tearing down (2 sub-exceptions)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker
    |     await checker.verify_remaining_keys_have_ttl()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl
    |     assert not keys_without_ttl, (
    | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d926b2a4-9956-43ec-ad7d-c6a40ada9417:progress:test-key', 'test-docket-d926b2a4-9956-43ec-ad7d-c6a40ada9417:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
  +---------------- 2 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_basics.py", line 23, in execution
    |     await execution.mark_as_completed()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed
    |     await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal
    |     await terminal_script(
    |     ...<8 lines>...
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ _________________ ERROR at teardown of test_progress_increment _________________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | 
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | 
File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c6540751-ebd9-42ff-8003-382653eee178:progress:test-key', 'test-docket-c6540751-ebd9-42ff-8003-382653eee178:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_basics.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ________________ ERROR at teardown of test_progress_set_message ________________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | 
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | 
File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9a85c62b-d845-4b55-9db1-b17c6ad71cfd:progress:test-key', 'test-docket-9a85c62b-d845-4b55-9db1-b17c6ad71cfd:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_basics.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ___________ ERROR at teardown of test_progress_dependency_injection ____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." 
raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-fb51ba7d-10fd-498d-9218-8da6c8f2b1ec:runs:019ce309-5ed2-77c7-b852-dff9f13824f6']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ___________ ERROR at teardown of test_progress_deleted_on_completion ___________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b43b7453-1934-45cf-afd0-f78ac1d53681:runs:019ce309-605b-75ef-bae0-a634b8e96c77']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_progress_with_multiple_increments __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f282e7c1-d571-4dc0-8041-82e729b93ef3:runs:019ce309-61da-7497-9758-bd2694852c0d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _______________ ERROR at teardown of test_progress_without_total _______________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-dc67737b-cb7e-49fd-ab3c-7f5a7c32e542:runs:019ce309-635c-73ad-adad-a01735b7e2e3']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ____________ ERROR at teardown of test_concurrent_progress_updates _____________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise 
exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-671e4a00-5ecb-4ed5-969b-1df859dc46a4:progress:test-key', 'test-docket-671e4a00-5ecb-4ed5-969b-1df859dc46a4:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_basics.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ______________ ERROR at teardown of test_progress_publish_events _______________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | 
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | 
File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7c47cfaf-e860-49bd-9d17-3d1f8241375b:progress:test-key', 'test-docket-7c47cfaf-e860-49bd-9d17-3d1f8241375b:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_pubsub.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ________________ ERROR at teardown of test_state_publish_events ________________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." 
raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5bb0c087-ba44-4aee-9609-256fbc629e83:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _______ ERROR at teardown of test_run_subscribe_both_state_and_progress ________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise 
exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f4e1ed9a-1fa3-4f79-b4a7-34c3d21645a3:progress:test-key', 'test-docket-f4e1ed9a-1fa3-4f79-b4a7-34c3d21645a3:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_pubsub.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ __________ ERROR at teardown of test_completed_state_publishes_event ___________ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | 
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec | return self._inner_hookexec(hook_name, methods, kwargs, firstresult) | ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall | raise exception | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown | yield | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall | teardown.throw(exception) | ~~~~~~~~~~~~~~^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown | return (yield) | ^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall | res = hook_impl.function(*args) | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown | item.session._setupstate.teardown_exact(nextitem) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact | raise exceptions[0] | ExceptionGroup: errors while tearing down (2 sub-exceptions) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | 
File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker | await checker.verify_remaining_keys_have_ttl() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl | assert not keys_without_ttl, ( | ^^^^^^^^^^^^^^^^^^^^ | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e650b0dc-c1e1-4416-a969-7169dc5f1050:progress:test-key', 'test-docket-e650b0dc-c1e1-4416-a969-7169dc5f1050:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
+---------------- 2 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish | raise exceptions[0] | File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish | fin() | ~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer | runner.run(async_finalizer(), context=context) | ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run | return self._loop.run_until_complete(task) | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^ | File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete | return future.result() | ~~~~~~~~~~~~~^^ | File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer | await gen_obj.__anext__() # type: ignore[union-attr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_pubsub.py", line 23, in execution | await execution.mark_as_completed() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... 
| ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ______ ERROR at teardown of test_failed_state_publishes_event_with_error _______ + Exception Group Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 344, in from_call | result: TResult | None = func() | ~~~~^^ | File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 246, in | lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise | ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/pluggy/_hooks.py", line 512, in __call__ | return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) | 
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/lib/python3.14/site-packages/pluggy/_manager.py", line 120, in _hookexec
  |     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  |            ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 167, in _multicall
  |     raise exception
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |     ~~~~~~~~~~~~~~^^^^^^^^^^^
  |   File "/usr/lib/python3.14/site-packages/_pytest/logging.py", line 858, in pytest_runtest_teardown
  |     yield
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 139, in _multicall
  |     teardown.throw(exception)
  |     ~~~~~~~~~~~~~~^^^^^^^^^^^
  |   File "/usr/lib/python3.14/site-packages/_pytest/capture.py", line 905, in pytest_runtest_teardown
  |     return (yield)
  |            ^^^^^
  |   File "/usr/lib/python3.14/site-packages/pluggy/_callers.py", line 121, in _multicall
  |     res = hook_impl.function(*args)
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 193, in pytest_runtest_teardown
  |     item.session._setupstate.teardown_exact(nextitem)
  |     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^
  |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 557, in teardown_exact
  |     raise exceptions[0]
  | ExceptionGroup: errors while tearing down (2 sub-exceptions)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |     ~~~^^
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |     ~~~^^
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |     ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |            ~~~~~~~~~~~~~^^
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |     ^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/conftest.py", line 296, in key_leak_checker
    |     await checker.verify_remaining_keys_have_ttl()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/_key_leak_checker.py", line 118, in verify_remaining_keys_have_ttl
    |     assert not keys_without_ttl, (
    |     ^^^^^^^^^^^^^^^^^^^^
    | AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-279034c4-ef22-480b-b750-a66df8760261:progress:test-key', 'test-docket-279034c4-ef22-480b-b750-a66df8760261:runs:test-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
  +---------------- 2 ----------------
    | Traceback (most recent call last):
    |   File "/usr/lib/python3.14/site-packages/_pytest/runner.py", line 546, in teardown_exact
    |     fin()
    |     ~~~^^
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1069, in finish
    |     raise exceptions[0]
    |   File "/usr/lib/python3.14/site-packages/_pytest/fixtures.py", line 1058, in finish
    |     fin()
    |     ~~~^^
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 289, in finalizer
    |     runner.run(async_finalizer(), context=context)
    |     ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib64/python3.14/asyncio/runners.py", line 127, in run
    |     return self._loop.run_until_complete(task)
    |            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
    |   File "/usr/lib64/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    |     return future.result()
    |            ~~~~~~~~~~~~~^^
    |   File "/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py", line 281, in async_finalizer
    |     await gen_obj.__anext__()  # type: ignore[union-attr]
    |     ^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_progress_pubsub.py", line 23, in execution
    |     await execution.mark_as_completed()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed
    |     await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal
    |     await terminal_script(
    |     ...<8 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    |     return await client.evalsha(self.sha, len(keys), *args)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    |     return await conn.retry.call_with_retry(
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |     ...<4 lines>...
    |     )
    |     ^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    |     return await do()
    |            ^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    |     return await self.parse_response(conn, command_name, **options)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    |     response = await connection.read_response()
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    |     raise response
    | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
    | stack traceback:
    |   [string ""]:22: in main chunk
  +------------------------------------
_____ ERROR at teardown of test_end_to_end_progress_monitoring_with_worker _____

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed).
        Completed/failed tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-828b4c2d-901f-4ea9-90d9-c0f85b6f7135:runs:019ce309-6ca3-702a-a2fc-0f588358127d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_end_to_end_failed_task_monitoring __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-54c83d6f-79ab-4058-9e80-b55c0f8f2adf:runs:019ce309-6e8e-73e7-8aeb-e7d706dc5089']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_subscribing_to_completed_execution _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-82de1f6e-a9f9-4726-9034-3d33512d1e40:runs:already-done:123']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __________ ERROR at teardown of test_redelivery_from_abandoned_worker __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-146e540c-0673-4388-9095-a2fff9a5c8a0:runs:019ce309-71b1-732a-9542-ed6e18795a49']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __________ ERROR at teardown of test_long_running_task_not_duplicated __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a4add5bb-8660-4591-acc8-c2ed17e2f871:runs:slow-1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ____________ ERROR at teardown of test_retry_with_long_running_task ____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-64da9ca4-3561-4f42-87e2-b77a3aa4f778:runs:flaky']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_multiple_workers_no_duplicate_execution _______ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-97ddd243-4977-48e3-bdce-1876c429f2b3:runs:task-0']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _________ ERROR at teardown of test_perpetual_task_with_lease_renewal __________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c1e23d20-e9f1-451b-b927-e25ecbaaa651:runs:perpetual']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_user_timeout_longer_than_redelivery _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-33132519-2b76-4120-aba8-1405dae7fae8:runs:019ce309-7a05-75c4-91b3-883c539a8f27']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_workers_with_same_redelivery_timeout ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ad5e4ff9-4f06-49f6-9eec-3e952317be2b:runs:task-0']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_worker_joining_doesnt_steal_renewed_lease ______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-87d93ea6-0743-4024-9ff1-3e6cc32bca87:runs:task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_lease_renewal_recovers_from_redis_error _______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8fc79f60-1740-4445-89b7-ce04811c3334:runs:019ce309-7ea9-7327-b434-0b6dd776b83b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_get_result_waits_for_completion ___________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bcd0e6fa-c0e2-4297-9145-cce1062f48f8:runs:019ce309-8128-7294-9f47-eea44956f696']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________________ ERROR at teardown of test_get_result_timeout _________________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-14931a64-db42-47ca-9475-c0efc7a1158f:runs:019ce309-8316-7721-9fce-eee2b27f0ec8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_multiple_concurrent_get_result_calls ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-72202011-76e5-47c4-bf2a-80848574720e:runs:019ce309-84b9-71de-a218-3f813f01099e']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_get_result_on_already_completed_task ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-42f9e35a-b4e9-4e2f-8acb-776da12c794b:runs:019ce309-8699-748d-a898-faafb4ed4300']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_get_result_on_already_failed_task __________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bf0f8fc0-a9f4-4ded-86a3-642515d2553f:runs:019ce309-8844-7288-9772-78e227353a17']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_get_result_with_malformed_result_data ________

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-00d4fcda-7bfb-4273-bb55-d75a58042c6c:runs:019ce309-8b17-72ab-9611-04fe7e5bb3ff']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_get_result_failed_task_with_missing_exception_data _

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)

>       runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.14/asyncio/runners.py:127: in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer
    await gen_obj.__anext__()  # type: ignore[union-attr]
    ^^^^^^^^^^^^^^^^^^^^^^^^^
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    async def verify_remaining_keys_have_ttl(self) -> None:
        """Verify that all remaining keys either have TTL or are explicitly permanent.

        This prevents memory leaks by ensuring that any data keys created during
        operations will eventually expire. Keys without TTL are allowed only for
        tasks that are still scheduled/queued (not yet executed). Completed/failed
        tasks should have TTL set.
        """
        async with self.docket.redis() as redis:
            # Get all keys for this docket (use :* to avoid matching dockets with suffixes)
            pattern = f"{self.docket_prefix}:*"
            keys_without_ttl: list[str] = []

            # Use scan_iter instead of keys() for cluster compatibility
            async for key in redis.scan_iter(match=pattern):  # type: ignore
                key_str = key.decode() if isinstance(key, bytes) else str(key)  # type: ignore[reportUnknownArgumentType]

                # Skip explicitly permanent keys
                if key_str in self.permanent_keys:
                    continue

                # Skip permanent key patterns
                if any(key_str.startswith(pat) for pat in self.permanent_patterns):
                    continue

                # Skip exempted keys
                if key_str in self.exemptions:
                    continue

                # Skip pattern-exempted keys
                if self.pattern_exemptions:
                    if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions):
                        continue

                # Check TTL (-1 means no expiry, -2 means key doesn't exist)
                ttl = await redis.ttl(key_str)
                if ttl == -1:
                    # Key has no TTL - check if it's for a scheduled task
                    is_allowed = await self._is_scheduled_task_key(key_str, redis)
                    if not is_allowed:
                        keys_without_ttl.append(key_str)

>           assert not keys_without_ttl, (
            ^^^^^^^^^^^^^^^^^^^^
                f"Memory leak detected: The following keys have no TTL "
                f"and will never expire: {keys_without_ttl}. All data keys should have TTL set "
                f"to prevent permanent memory usage. Keys without TTL are only allowed for "
                f"tasks that are still scheduled/queued (not yet executed)."
            )
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d05ee265-dd6a-49a2-af14-f65b39f64794:runs:019ce309-8ca3-700f-9a9a-b12166ebd6c4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_get_result_with_timeout_timedelta __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6cc55069-4ab1-4dd6-9dfc-22b4d5f4eed0:runs:019ce309-8e5b-779f-ae22-cc7910909888']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_get_result_with_deadline_datetime __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e75bdd93-5dea-4caf-a4e7-61979141755b:runs:019ce309-904b-77a9-a703-4d556386c172']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_get_result_timeout_on_pending_task _________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-0c33def1-f903-49d6-ac47-381937b4f78b:runs:019ce309-9277-756b-b2ba-38cb44e2de7d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_result_storage_for_int_return ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-307d456f-6174-4923-8d6f-71657cc41b50:runs:019ce309-940d-7743-8f09-e197213e1f80']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_result_storage_for_str_return ____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e316b734-36a6-424e-abe8-bd5a17f43199:runs:019ce309-95a6-73f6-98ee-03827fca5636']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_result_storage_for_dict_return ___________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7e5d7b01-536f-43a7-9960-00da78d2312a:runs:019ce309-9748-76d7-87ca-b417f616cca2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_result_storage_for_object_return __________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b7114ddf-0028-409d-b82b-8b894d68583c:runs:019ce309-98dc-75ea-85cb-2466801b0b5d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_no_storage_for_none_annotated_task _________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-61133b0c-6b4c-49c9-b066-4a72a9b2e35c:runs:019ce309-9a87-7792-b121-c677f1c392c2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_no_storage_for_runtime_none _____________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8277ff60-8578-4e01-8a27-e750c7b146cc:runs:019ce309-9c81-722a-90ca-ed9ed439c8c9']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_exception_storage_and_retrieval ___________
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-93cffcb0-7555-4d4a-b871-ead6628bf5c5:runs:019ce309-9e1c-73b8-ba62-efcf4f1f7e59']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_result_key_stored_in_execution_record ________
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4e629066-7ace-4675-b919-8abf4b8f767f:runs:019ce309-9fbd-72c6-8312-4e731d34b10a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_strike_incomparable_values[>-42-string] _______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1941db54-8d74-4020-884b-0a8aec508611:runs:019ce309-ccc1-726d-98a7-93c62378e499']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.GREATER_THAN 'string'
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 429, in _is_match
    return value > strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '>' not supported between instances of 'str' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.GREATER_THAN 'string'
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 429, in _is_match
    return value > strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '>' not supported between instances of 'str' and 'int'
______ ERROR at teardown of test_strike_incomparable_values[<-string-42] _______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-40f8f52c-cabb-452a-bfcc-d862159e49ef:runs:019ce309-ce5e-7231-9808-5590be971455']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 'string' Operator.LESS_THAN 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'int' and 'str'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 'string' Operator.LESS_THAN 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'int' and 'str'
_______ ERROR at teardown of test_strike_incomparable_values[>=-None-42] _______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b4aa0b4f-f0ee-4136-b0a4-8869bf519dfe:runs:019ce309-cffb-7238-b560-c24aeb3c13f7']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: None Operator.GREATER_THAN_OR_EQUAL 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 431, in _is_match
    return value >= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '>=' not supported between instances of 'int' and 'NoneType'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: None Operator.GREATER_THAN_OR_EQUAL 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 431, in _is_match
    return value >= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '>=' not supported between instances of 'int' and 'NoneType'
_______ ERROR at teardown of test_strike_incomparable_values[<=-42-None] _______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ef273f00-dbb5-442a-bb60-f6c0b3df6c45:runs:019ce309-d1b7-7727-87ce-57437637d8e5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL None
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'NoneType' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL None
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'NoneType' and 'int'
______ ERROR at teardown of test_strike_incomparable_values[>-value4-42] _______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4a89691a-0821-4dc2-bb91-4eee8bc1de9a:runs:019ce309-d3ac-71fc-815f-e06351fe7041']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: Operator.GREATER_THAN {}
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:457 Incompatible type for strike condition: Operator.GREATER_THAN {}
--------------------------- Captured stderr teardown ---------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: > {}
---------------------------- Captured log teardown -----------------------------
WARNING  docket.strikelist:strikelist.py:457 Incompatible type for strike condition: > {}
____ ERROR at teardown of test_strike_incomparable_values[<-42-test_value5] ____
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8a58c776-de06-4fb2-82d7-28a8d597c9ed:runs:019ce309-d543-7732-b1b8-461e39ac4531']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN {}
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'dict' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN {}
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'dict' and 'int'
______ ERROR at teardown of test_strike_incomparable_values[>=-value6-42] ______
[teardown traceback identical to the first failure above: pytest_asyncio finalizer -> tests/conftest.py:296 key_leak_checker -> verify_remaining_keys_have_ttl]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-4fead7d2-66da-4693-8465-4eaf6577d843:runs:019ce309-d6dc-7454-a943-e0a29de88772']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: Operator.GREATER_THAN_OR_EQUAL []
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:457 Incompatible type for strike condition: Operator.GREATER_THAN_OR_EQUAL []
--------------------------- Captured stderr teardown ---------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: >= []
---------------------------- Captured log teardown -----------------------------
WARNING  docket.strikelist:strikelist.py:457 Incompatible type for strike condition: >= []
___ ERROR at teardown of test_strike_incomparable_values[<=-42-test_value7] ____

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
        raise ValueError(msg)

>   runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
[traceback through tests/conftest.py:296 and the verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-8163de06-e394-4ff6-ae04-8627fee0f333:runs:019ce309-d881-77a3-a34e-5a83bf61818b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL []
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'list' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL []
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'list' and 'int'
______ ERROR at teardown of test_restored_automatic_perpetual_does_start _______

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
        raise ValueError(msg)

>   runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
[traceback through tests/conftest.py:296 and the verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d93cd2af-e0b7-4ba8-bb8b-705438b4ec9b:runs:my_restored_automatic_task']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO:docket.strikelist:Striking 'my_restored_automatic_task(* == *)'
INFO:docket.strikelist:Restoring 'my_restored_automatic_task(* == *)'
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* my_restored_automatic_task(...)
INFO:docket.strikelist:Striking 'my_restored_automatic_task(* == *)'
INFO:docket.strikelist:Restoring 'my_restored_automatic_task(* == *)'
------------------------------ Captured log call -------------------------------
INFO     docket.strikelist:strikelist.py:607 Striking 'my_restored_automatic_task(* == *)'
INFO     docket.strikelist:strikelist.py:607 Restoring 'my_restored_automatic_task(* == *)'
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * my_restored_automatic_task(...)
INFO     docket.strikelist:strikelist.py:607 Striking 'my_restored_automatic_task(* == *)'
INFO     docket.strikelist:strikelist.py:607 Restoring 'my_restored_automatic_task(* == *)'
_ ERROR at teardown of test_assert_task_scheduled_finds_task_by_function_only __

    def finalizer() -> None:
        """Yield again, to finalize."""

        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()  # type: ignore[union-attr]
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
        raise ValueError(msg)

>   runner.run(async_finalizer(), context=context)

/usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289:
[traceback through tests/conftest.py:296 and the verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-61dd5b06-e06f-4027-8c74-7d134c49af02:runs:019ce309-dda9-754a-bce7-3feb23036e02']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_scheduled_finds_task_by_function_and_args _
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-dcc96fe1-2ee1-479a-ac3b-2d7ee56b9134:runs:019ce309-df97-71ea-87cf-2782e89a670a']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_scheduled_finds_task_by_function_and_kwargs _
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-486f8826-9ca1-457c-a041-004f57639d23:runs:019ce309-e13a-703a-956e-d9d5972bc19b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_scheduled_finds_task_by_function_args_and_kwargs _
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2cc0534f-e5be-456c-81c3-eabe8b7848eb:runs:019ce309-e2dc-7688-b821-1d85934f6193']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_assert_task_scheduled_finds_task_by_key _______
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-82c4f4a8-0c90-4a6b-acd7-0d0719691b4d:runs:my-task-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_assert_task_scheduled_works_with_function_name ___
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f3462674-153b-453e-becc-2f9ad2b088ef:runs:019ce309-e626-7442-a13b-33dcba1cd6ee']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_scheduled_succeeds_with_multiple_matching_tasks _
[finalizer, traceback, and verify_remaining_keys_have_ttl source identical to the first error above]
E       AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b8e10e5e-e241-4489-aab4-17a1cc7a33c5:runs:019ce309-e7cf-72cc-81c9-f44e2cf7766f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_assert_task_scheduled_fails_when_task_not_found ___ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-bc766993-a361-4422-8078-daf766d8091d:runs:019ce309-e983-7115-991f-f9e2339db653']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_assert_task_scheduled_fails_when_args_dont_match __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a4dcd4c3-c5b0-47c7-96dd-041b22ea5eb8:runs:019ce309-eb7c-727f-97cb-c9de856198b7']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _ ERROR at teardown of test_assert_task_scheduled_fails_when_kwargs_dont_match _ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e0cf70ee-ca27-414d-9448-1c332f7b1f45:runs:019ce309-ed26-7485-9197-7d42142676e7']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _ ERROR at teardown of test_assert_task_scheduled_finds_scheduled_future_task __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b91785fb-eb81-40d0-aa18-05a3c90d7781:runs:019ce309-eecc-74a4-a8b3-6aa8552963e4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _ ERROR at teardown of test_assert_task_not_scheduled_succeeds_when_different_task _ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-790c0a6f-1920-46f5-9aa7-b829fc1f4251:runs:019ce309-f11b-7376-84bc-cf0ea35c4067']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError __ ERROR at teardown of test_assert_task_not_scheduled_fails_when_task_exists __ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b3df47d9-3d6b-4393-bde0-1664002539b6:runs:019ce309-f2c8-7443-8e0d-dc9f6927dad4']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ____ ERROR at teardown of test_assert_task_not_scheduled_with_specific_args ____ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-57816f4d-d9e5-4a95-827e-2b2610d44f1d:runs:019ce309-f47a-726e-b525-28534e511618']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ____________ ERROR at teardown of test_assert_task_count_all_tasks _____________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-1f889d7b-7de7-453e-af81-8e55dc63a8ab:runs:019ce309-f681-75aa-8d04-1f6df5973404']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_assert_task_count_for_specific_function _______ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d96e1f15-3331-47e2-b5f1-6e3262515c09:runs:019ce309-f830-7355-85d9-f89607a6656c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_assert_task_count_fails_with_wrong_count ______ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." raise ValueError(msg) > runner.run(async_finalizer(), context=context) /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.14/asyncio/runners.py:127: in run return self._loop.run_until_complete(task) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/asyncio/base_events.py:719: in run_until_complete return future.result() ^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/pytest_asyncio/plugin.py:281: in async_finalizer await gen_obj.__anext__() # type: ignore[union-attr] ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/conftest.py:296: in key_leak_checker await checker.verify_remaining_keys_have_ttl() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = async def verify_remaining_keys_have_ttl(self) -> None: """Verify that all remaining keys either have TTL or are explicitly permanent. This prevents memory leaks by ensuring that any data keys created during operations will eventually expire. Keys without TTL are allowed only for tasks that are still scheduled/queued (not yet executed). Completed/failed tasks should have TTL set. 
""" async with self.docket.redis() as redis: # Get all keys for this docket (use :* to avoid matching dockets with suffixes) pattern = f"{self.docket_prefix}:*" keys_without_ttl: list[str] = [] # Use scan_iter instead of keys() for cluster compatibility async for key in redis.scan_iter(match=pattern): # type: ignore key_str = key.decode() if isinstance(key, bytes) else str(key) # type: ignore[reportUnknownArgumentType] # Skip explicitly permanent keys if key_str in self.permanent_keys: continue # Skip permanent key patterns if any(key_str.startswith(pat) for pat in self.permanent_patterns): continue # Skip exempted keys if key_str in self.exemptions: continue # Skip pattern-exempted keys if self.pattern_exemptions: if any(fnmatch(key_str, pat) for pat in self.pattern_exemptions): continue # Check TTL (-1 means no expiry, -2 means key doesn't exist) ttl = await redis.ttl(key_str) if ttl == -1: # Key has no TTL - check if it's for a scheduled task is_allowed = await self._is_scheduled_task_key(key_str, redis) if not is_allowed: keys_without_ttl.append(key_str) > assert not keys_without_ttl, ( ^^^^^^^^^^^^^^^^^^^^ f"Memory leak detected: The following keys have no TTL " f"and will never expire: {keys_without_ttl}. All data keys should have TTL set " f"to prevent permanent memory usage. Keys without TTL are only allowed for " f"tasks that are still scheduled/queued (not yet executed)." ) E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3ec0b894-2592-4610-b7fe-452e3de684e7:runs:019ce309-fa8c-70ff-938d-0766fe01f72d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_assert_task_count_with_function_name ________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9904bba6-d4d7-461f-9698-6afa96a02b07:runs:019ce309-fc3c-73a0-9b93-0a019d41664d']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_assert_no_tasks_fails_when_tasks_present ______
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-9f5f0f94-9ff9-45fb-9b7c-fbd05f6eb645:runs:019ce309-fe97-7125-afb4-68d33a0a9375']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_assert_no_tasks_after_tasks_complete ________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c322fbf4-0e9c-46c6-b695-34fbb746d828:runs:019ce30a-004d-777d-b7f1-4e3bca10f47b']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_assert_task_scheduled_partial_kwargs_match _____
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-16fcfde7-84d7-4a24-8140-62b0c4e2e8f1:runs:019ce30a-0210-7656-9720-f5c1b932f167']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_count_includes_future_and_immediate_tasks _
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c7b1e8f7-8463-405c-94a9-f41b4d4b376e:runs:019ce30a-0416-7434-af5d-6b05d4ecee7c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_assert_task_scheduled_fails_when_key_doesnt_match __
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-2e162724-9129-4366-88a6-e0dc0150c918:runs:task-1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_redis_key_cleanup_successful_task __________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b4bea7d0-4c16-45c0-964d-5796b6fa44a0:runs:019ce30a-1cee-702d-8b48-ee18104341a0']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___________ ERROR at teardown of test_redis_key_cleanup_failed_task ____________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-a811ad24-5cdc-47c8-9c09-621e91c523d3:runs:019ce30a-1ea5-72d9-ae2a-1343c54d0bd8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__________ ERROR at teardown of test_redis_key_cleanup_cancelled_task __________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-773f3a6c-c0a5-41dd-b9a1-b744596328d4:runs:019ce30a-2060-7517-9fb1-10e145c3bfbf']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
____________ ERROR at teardown of test_worker_acknowledges_messages ____________
[identical teardown traceback elided]
tests/conftest.py:296: in key_leak_checker
    await checker.verify_remaining_keys_have_ttl()
E   AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e54501a9-394c-4ae7-964b-c613fb1615e5:runs:019ce30a-2940-776b-be98-99ac75e31a46']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______________ ERROR at teardown of test_two_workers_split_work _______________
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-94b16a6c-19fe-4f99-be03-2300976f5a6a:runs:019ce30a-2af1-744a-b679-6e3864876e60']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_____ ERROR at teardown of test_worker_reconnects_when_connection_is_lost ______
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-40b8c2a0-29f8-4b01-b8fd-6ac08f2cf8b2:runs:019ce30a-2ca6-76e8-aa3c-f79bfd250b09']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_________ ERROR at teardown of test_worker_respects_concurrency_limit __________
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-f578ac2a-4fb8-4715-853c-18972ecbbe63:runs:019ce30a-2e63-71e9-92f8-eacbe0b78ec1']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_handles_unregistered_task_execution_on_initial_delivery _
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e4d34c5a-9309-4a6b-90db-282c96ce4133:runs:019ce30a-301a-7764-961d-a479beda9c24']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_handles_unregistered_task_execution_on_redelivery _
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-7553039d-9531-4583-ac4b-fa037dd12e58:runs:019ce30a-31e4-755a-83d1-618454938d9f']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
__ ERROR at teardown of test_worker_concurrency_cleanup_without_dependencies ___
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-baac79e8-c6e7-4780-9ac7-7dd293473b2d:runs:019ce30a-5196-718d-a6cf-afc7378a4eaa']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_worker_concurrency_no_limit_with_custom_docket ___
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6e44d46c-ff70-4e19-8935-862221ba2b8f:runs:019ce30a-5352-7664-93c9-4885cce01221']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_worker_exception_before_dependencies ________
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-b0e69728-626d-4cda-96fc-228ac0efeb68:runs:019ce30a-550c-736c-bf97-572eeba55bca']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_invariant_tasks_by_key_empty_after_completion ____
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e4a67f0e-f7cd-4e5e-aff9-4b086eb5ef82:runs:019ce30a-56c7-7461-bc59-f552b487cec5']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_invariant_tasks_by_key_no_growth_over_batches ____
    [identical teardown traceback elided]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-65b2a2bf-56a9-48ec-9213-38b8a976b435:runs:019ce30a-587e-74e1-ab94-f4819a292f51']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_invariant_execution_counts_empty_after_completion __
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-10db31fb-0202-4b47-8faf-dcf979ec96fa:runs:019ce30a-5a44-71e9-acd0-b24019ca17ab']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_invariant_execution_counts_cleared_after_run_at_most _
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-e55ade8e-fa70-4850-bd72-e7292a772597:runs:test-perpetual']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
______ ERROR at teardown of test_invariant_cleanup_after_task_exceptions _______
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-cca7e35f-3cb9-4990-916c-a05238bcc287:runs:019ce30a-5e98-7784-be78-36f8472776a3']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
________ ERROR at teardown of test_invariant_cleanup_with_varied_tasks _________
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ce445e5f-8897-4dbe-af69-f4741bb3a324:runs:019ce30a-6046-7460-8462-7c3032b1a1ae']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_run_forever_cancels_promptly_with_future_tasks ___
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d59f2d5b-4594-4404-95be-658719a67eca:runs:019ce30a-62b8-7261-a2ec-411478d228d3']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_run_until_finished_exits_promptly_with_future_tasks _
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ad208701-43df-43f0-9eb5-73beb8308c9b:runs:019ce30a-6492-75e2-9759-fa18e67a7a71']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
___ ERROR at teardown of test_run_at_most_cancels_promptly_with_future_tasks ___
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3de825e5-95b6-4d31-bb32-9409f03c0629:runs:019ce30a-665f-75e5-bc9a-b1be1ca21964']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_______ ERROR at teardown of test_worker_drains_active_tasks_on_shutdown _______
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-49b3c02b-df25-42a9-929b-957bbe441f1d:runs:019ce30a-75d7-75e7-bc29-2cf139e219ae']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_perpetual_tasks_are_scheduled_close_to_target_time _
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-3313dc2a-32cd-4aac-b4de-234ab81faafa:runs:my-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError
_ ERROR at teardown of test_worker_can_exit_from_perpetual_tasks_that_queue_further_tasks _
[... teardown traceback and verify_remaining_keys_have_ttl source identical to the first ERROR above ...]
E           AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-d6a3df5f-01d2-4a79-ba02-1cbac30f47d9:runs:019ce30a-79b3-76d7-a4c4-2a1f10ecfc62']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed).
tests/_key_leak_checker.py:118: AssertionError _ ERROR at teardown of test_worker_can_exit_from_long_horizon_perpetual_tasks __ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-ecb7110d-f10a-429b-9d5b-e5790ab84bb2:runs:my-key']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _____ ERROR at teardown of test_worker_timeout_exceeds_redelivery_timeout ______ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6b26a070-22fa-42ff-941e-d0267d19c179:runs:019ce30a-7e21-7256-8208-31212243791c']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______ ERROR at teardown of test_replacement_race_condition_stream_tasks _______ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-757a5e2c-3b24-43e9-a1e0-e4fc22b2b3d5:runs:my-cool-task:a86adebe-ce6e-4ca3-9185-e997b70eeaf2']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ________ ERROR at teardown of test_replace_task_in_queue_before_stream _________ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-5ef7fc00-81cc-4ac4-ba76-cf337cb343f5:runs:my-cool-task:47737ece-6a07-4ec0-a4d8-79a72054d3d8']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ______________ ERROR at teardown of test_rapid_replace_operations ______________ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-6d85ee68-2631-439c-893e-6e6a7e1db7a6:runs:my-cool-task:819f6ab7-8d15-410c-bbd1-2dad32ecc771']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError _____ ERROR at teardown of test_wrongtype_error_with_legacy_known_task_key _____ [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4:progress:legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc', 'test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4:runs:legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) INFO:docket.worker:↪ [ 60ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} INFO:docket.task.trace:legacy task test: 'legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc' added to docket 'test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4' 0:00:00.060194 ago now running on worker 'bde710d1552f45a3b346e4ebac757aed#574' ERROR:docket.worker:↩ [ 1ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:828 ↪ [ 60ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} INFO docket.task.trace:tasks.py:24 legacy task test: 'legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc' added to docket 'test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4' 0:00:00.060194 ago now running on worker 'bde710d1552f45a3b346e4ebac757aed#574' ERROR docket.worker:worker.py:975 ↩ [ 1ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk _________ ERROR at teardown of test_replace_task_with_legacy_known_key _________ def finalizer() -> None: """Yield again, to finalize.""" async def async_finalizer() -> None: try: await gen_obj.__anext__() # type: ignore[union-attr] except StopAsyncIteration: pass else: msg = "Async generator fixture didn't stop." msg += "Yield only once." 
raise ValueError(msg) [repeated key_leak_checker teardown traceback elided; identical to the first ERROR block above] E AssertionError: Memory leak detected: The following keys have no TTL and will never expire: ['test-docket-038d080c-8b44-4c6a-953f-db1e1d4145ab:runs:legacy-replace-task:98b8d2d8-2c06-4fa2-bb77-0bb4ca522a69']. All data keys should have TTL set to prevent permanent memory usage. Keys without TTL are only allowed for tasks that are still scheduled/queued (not yet executed). 
tests/_key_leak_checker.py:118: AssertionError =================================== FAILURES =================================== ___________________________ test_rich_logging_format ___________________________ self = keys = ['test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:stream', 'test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:known:01...-7247-8896-e79ba3e3ed20', 'test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:runs:019ce307-0457-7247-8896-e79ba3e3ed20'] args = ('test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:stream', 'test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:known:01...-8896-e79ba3e3ed20', 'test-docket-6d85223b-4817-49f1-88eb-9663ffac8845:runs:019ce307-0457-7247-8896-e79ba3e3ed20', ...) client = ,db=0)>)>)> async def __call__( self, keys: Union[Sequence[KeyT], None] = None, args: Union[Iterable[EncodableT], None] = None, client: Union["redis.asyncio.client.Redis", None] = None, ): """Execute the script, passing any required ``args``""" keys = keys or [] args = args or [] if client is None: client = self.registered_client args = tuple(keys) + tuple(args) # make sure the Redis server knows about the script from redis.asyncio.client import Pipeline if isinstance(client, Pipeline): # Make sure the pipeline can register the script before executing. 
client.scripts.add(self) try: > return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.NoScriptError: No matching script. Please use EVAL. 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError During handling of the above exception, another exception occurred: docket = async def test_rich_logging_format(docket: Docket): """Should use rich formatting for logs by default""" > await docket.add(trace)("hello") tests/cli/test_worker.py:114: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________________ test_plain_logging_format ___________________________ docket = async def test_plain_logging_format(docket: Docket): """Should use plain formatting for logs when specified""" > await docket.add(trace)("hello") tests/cli/test_worker.py:137: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_basic_concurrency_limit _________________________ docket = worker = async def test_basic_concurrency_limit(docket: Docket, worker: Worker): """Test basic concurrency limiting functionality.""" results: list[str] = [] async def test_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): results.append(f"start_{customer_id}") await asyncio.sleep(0.01) # Short delay results.append(f"end_{customer_id}") # Schedule 2 tasks for the same 
customer > await docket.add(test_task)(customer_id=1) tests/concurrency_limits/test_basic.py:30: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = 
kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_per_task_concurrency_limit ________________________ docket = worker = async def test_per_task_concurrency_limit(docket: Docket, worker: Worker): """Test concurrency limit without argument_name limits the task itself.""" execution_intervals: list[tuple[float, float]] = [] async def limited_task( task_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit(max_concurrent=2), ): start = time.monotonic() await asyncio.sleep(0.05) end = time.monotonic() execution_intervals.append((start, end)) # Schedule 4 tasks for i in range(4): > await docket.add(limited_task)(task_id=i) tests/concurrency_limits/test_basic.py:59: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await 
self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_concurrency_limit_single_argument ____________________ docket = worker = async def test_concurrency_limit_single_argument(docket: Docket, worker: Worker): """Test that ConcurrencyLimit enforces single concurrent execution per argument value.""" execution_order: list[str] = [] execution_intervals: list[tuple[float, float]] = [] async def slow_task( customer_id: int, concurrency: 
ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): start = time.monotonic() execution_order.append(f"start_{customer_id}") # Simulate some work await asyncio.sleep(0.2) end = time.monotonic() execution_order.append(f"end_{customer_id}") execution_intervals.append((start, end)) # Schedule multiple tasks for the same customer_id > await docket.add(slow_task)(customer_id=1) tests/concurrency_limits/test_basic.py:100: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: 
raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_concurrency_limit_different_arguments __________________ docket = worker = async def test_concurrency_limit_different_arguments(docket: Docket, worker: Worker): """Test that tasks with different argument values can run concurrently.""" execution_order: list[str] = [] execution_intervals: dict[int, tuple[float, float]] = {} async def slow_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): start = time.monotonic() execution_order.append(f"start_{customer_id}") # Simulate some work await asyncio.sleep(0.1) end = time.monotonic() execution_order.append(f"end_{customer_id}") execution_intervals[customer_id] = (start, end) # Schedule tasks for different customer_ids > await docket.add(slow_task)(customer_id=1) tests/concurrency_limits/test_basic.py:151: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_concurrency_limit_max_concurrent _____________________ docket = worker = async def test_concurrency_limit_max_concurrent(docket: Docket, worker: Worker): """Test that max_concurrent parameter works correctly.""" execution_order: list[str] = [] active_tasks: list[int] = [] max_concurrent_seen = 0 lock = asyncio.Lock() async def slow_task( task_id: int, db_name: str, concurrency: ConcurrencyLimit = ConcurrencyLimit("db_name", max_concurrent=2), ): nonlocal max_concurrent_seen async with lock: active_tasks.append(task_id) max_concurrent_seen = max(max_concurrent_seen, len(active_tasks)) execution_order.append(f"start_{task_id}") # Simulate some work await asyncio.sleep(0.1) async with lock: active_tasks.remove(task_id) execution_order.append(f"end_{task_id}") # Schedule 5 tasks for the same db_name (should be limited to 2 concurrent) for i in range(5): > await docket.add(slow_task)(task_id=i, db_name="postgres") tests/concurrency_limits/test_basic.py:195: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_concurrency_limit_missing_argument_error _________________ docket = worker = async def test_concurrency_limit_missing_argument_error(docket: Docket, worker: Worker): """Test that missing argument causes proper error handling.""" async def task_with_missing_arg( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "missing_arg", max_concurrent=1 ), ): pass # pragma: no cover > await docket.add(task_with_missing_arg)(customer_id=123) 
tests/concurrency_limits/test_basic.py:219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) 
response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_concurrency_limit_with_custom_scope ___________________ docket = worker = async def test_concurrency_limit_with_custom_scope(docket: Docket, worker: Worker): """Test that custom scope parameter works correctly.""" execution_order: list[str] = [] # Use my-application: prefix for custom scopes (allowed by ACL for user-managed keys) async def task_with_scope( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1, scope="my-application:custom" ), ): execution_order.append(f"task_{customer_id}") > await docket.add(task_with_scope)(customer_id=1) tests/concurrency_limits/test_basic.py:238: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_concurrency_limit_without_concurrency_dependency _____________ docket = worker = async def test_concurrency_limit_without_concurrency_dependency( docket: Docket, worker: Worker ): """Test that tasks without ConcurrencyLimit work normally.""" execution_count = 0 async def normal_task(customer_id: int): nonlocal execution_count execution_count += 1 # Schedule multiple tasks for i in range(5): > await docket.add(normal_task)(customer_id=i) 
tests/concurrency_limits/test_basic.py:280: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) 
response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_concurrency_keys_are_handled _______________________ docket = worker = async def test_concurrency_keys_are_handled( docket: Docket, worker: Worker, ) -> None: """Verify that concurrency limit keys are properly handled. Concurrency keys have explicit TTLs and are self-cleaning via Lua script, so they should not leak after task completion. """ async def task_with_concurrency( resource_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit("resource_id", 1), ) -> None: pass > await docket.add(task_with_concurrency)(resource_id=42) tests/concurrency_limits/test_basic.py:407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_worker_concurrency_with_task_failures __________________

docket = 

    async def test_worker_concurrency_with_task_failures(docket: Docket):
        """Test that concurrency slots are properly released when tasks fail"""
        execution_count = 0
        failure_count = 0

        async def failing_task(
            customer_id: int,
            should_fail: bool,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal execution_count, failure_count
            execution_count += 1
            await asyncio.sleep(0.01)
            if should_fail:
                failure_count += 1
                raise ValueError("Task failed intentionally")

>       await docket.add(failing_task)(customer_id=1, should_fail=True)

tests/concurrency_limits/test_errors_and_resilience.py:43:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________ test_worker_concurrency_error_handling_during_execution ____________

docket = 

    async def test_worker_concurrency_error_handling_during_execution(docket: Docket):
        """Test that concurrency management handles errors gracefully during task execution"""
        tasks_executed = 0
        error_count = 0

        async def task_that_may_error(
            customer_id: int,
            should_error: bool,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal tasks_executed, error_count
            tasks_executed += 1
            if should_error:
                error_count += 1
                raise RuntimeError("Task execution error")

>       await docket.add(task_that_may_error)(customer_id=1, should_error=True)

tests/concurrency_limits/test_errors_and_resilience.py:73:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_worker_concurrency_multiple_workers_coordination _____________

docket = 

    async def test_worker_concurrency_multiple_workers_coordination(docket: Docket):
        """Test that multiple workers coordinate concurrency limits correctly"""
        worker1_executions = 0
        worker2_executions = 0
        total_concurrent = 0
        max_concurrent_observed = 0

        async def coordinated_task(
            customer_id: int,
            worker_name: str,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=2
            ),
        ):
            nonlocal total_concurrent, max_concurrent_observed
            nonlocal worker1_executions, worker2_executions
            total_concurrent += 1
            max_concurrent_observed = max(max_concurrent_observed, total_concurrent)
            if worker_name == "worker1":
                worker1_executions += 1
            else:
                worker2_executions += 1
            await asyncio.sleep(0.02)
            total_concurrent -= 1

        for _ in range(4):
>           await docket.add(coordinated_task)(customer_id=1, worker_name="worker1")

tests/concurrency_limits/test_errors_and_resilience.py:112:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_worker_concurrency_refresh_handles_redis_errors _____________

docket = 

    async def test_worker_concurrency_refresh_handles_redis_errors(docket: Docket):
        """Test that concurrency refresh mechanism handles Redis errors gracefully"""
        task_completed = False

        async def task_with_concurrency(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal task_completed
            await asyncio.sleep(0.02)
            task_completed = True

>       await docket.add(task_with_concurrency)(customer_id=1)

tests/concurrency_limits/test_errors_and_resilience.py:140:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_worker_concurrency_robustness_under_stress ________________

docket = 

    async def test_worker_concurrency_robustness_under_stress(docket: Docket):
        """Test that concurrency management remains robust under stress conditions"""
        successful_executions = 0
        max_concurrent = 0
        current_concurrent = 0

        async def stress_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=3
            ),
        ):
            nonlocal successful_executions, max_concurrent, current_concurrent
            current_concurrent += 1
            max_concurrent = max(max_concurrent, current_concurrent)
            try:
                await asyncio.sleep(0.005)
                successful_executions += 1
            finally:
                current_concurrent -= 1

        for _ in range(20):
>           await docket.add(stress_task)(customer_id=1)

tests/concurrency_limits/test_errors_and_resilience.py:187:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_worker_concurrency_edge_cases ______________________

docket = 

    async def test_worker_concurrency_edge_cases(docket: Docket):
        """Test edge cases in concurrency management"""
        edge_case_handled = True

        async def edge_case_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            pass

        for _ in range(5):
>           await docket.add(edge_case_task)(customer_id=1)

tests/concurrency_limits/test_errors_and_resilience.py:212:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________ test_worker_graceful_shutdown_with_concurrency_management ___________

docket = 

    async def test_worker_graceful_shutdown_with_concurrency_management(docket: Docket):
        """Test that workers shut down gracefully while managing concurrency"""
        task_started = asyncio.Event()
        task_completed = asyncio.Event()

        async def simple_task():
            task_started.set()
            await asyncio.sleep(0.01)
            task_completed.set()

>       await docket.add(simple_task)()

tests/concurrency_limits/test_errors_and_resilience.py:230:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_worker_concurrency_limits_task_queuing_behavior _____________

docket = 

    async def test_worker_concurrency_limits_task_queuing_behavior(docket: Docket):
        """Test that concurrency limits control task execution properly"""
        execution_log: ContextVar[list[tuple[str, int]]] = ContextVar("execution_log")
        execution_log.set([])

        async def task_with_concurrency(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=2
            ),
        ):
            log = execution_log.get()
            log.append(("start", customer_id))
            execution_log.set(log)
            await asyncio.sleep(0.01)
            log = execution_log.get()
            log.append(("end", customer_id))
            execution_log.set(log)

>       await docket.add(task_with_concurrency)(customer_id=1)

tests/concurrency_limits/test_execution_patterns.py:39:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_worker_concurrency_different_customer_branches ______________

docket = 

    async def test_worker_concurrency_different_customer_branches(docket: Docket):
        """Test that different customer IDs are handled in separate branches"""
        customers_executed: set[int] = set()

        async def track_customer_execution(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            customers_executed.add(customer_id)
            await asyncio.sleep(0.01)

        for customer_id in [1, 2, 3]:
>           await docket.add(track_customer_execution)(customer_id=customer_id)

tests/concurrency_limits/test_execution_patterns.py:74:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_worker_concurrency_limits_different_scopes ________________

docket = 

    async def test_worker_concurrency_limits_different_scopes(docket: Docket):
        """Test that concurrency limits work correctly with different scopes"""
        task_executions: list[tuple[str, int]] = []

        # Use my-application: prefix for custom scopes (allowed by ACL for user-managed keys)
        async def scoped_task(
            customer_id: int,
            scope_name: str,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1, scope="my-application:custom"
            ),
        ):
            task_executions.append((scope_name, customer_id))
            await asyncio.sleep(0.01)

        async def default_scoped_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            task_executions.append(("default", customer_id))
            await asyncio.sleep(0.01)

>       await docket.add(scoped_task)(customer_id=1, scope_name="custom")

tests/concurrency_limits/test_execution_patterns.py:106:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_worker_concurrency_refresh_mechanism_integration _____________

docket = 

    async def test_worker_concurrency_refresh_mechanism_integration(docket: Docket):
        """Test that concurrency refresh mechanism works in practice"""
        long_running_started = False
        quick_task_completed = False

        async def long_running_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal long_running_started
            long_running_started = True
            await asyncio.sleep(0.1)

        async def quick_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal quick_task_completed
            quick_task_completed = True

>       await docket.add(long_running_task)(customer_id=1)

tests/concurrency_limits/test_execution_patterns.py:141:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_worker_concurrency_with_quick_tasks ___________________

docket = 

    async def test_worker_concurrency_with_quick_tasks(docket: Docket):
        """Test that quick tasks complete without triggering complex cleanup paths"""
        completed_tasks = 0

        async def quick_task(
            customer_id: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=2
            ),
        ):
            nonlocal completed_tasks
            completed_tasks += 1

        for _ in range(5):
>           await docket.add(quick_task)(customer_id=1)

tests/concurrency_limits/test_execution_patterns.py:167:
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_worker_concurrency_with_dependencies_integration _____________

docket = 

    async def test_worker_concurrency_with_dependencies_integration(docket: Docket):
        """Test that concurrency limits work correctly with dependency injection"""
        task_completed = False
        current_worker_name = None

        async def task_with_dependencies(
            customer_id: int,
            worker: Worker = CurrentWorker(),
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "customer_id", max_concurrent=1
            ),
        ):
            nonlocal task_completed, current_worker_name
            current_worker_name = worker.name
            await asyncio.sleep(0.01)
            task_completed = True

>       await docket.add(task_with_dependencies)(customer_id=1)

tests/concurrency_limits/test_execution_patterns.py:192:
E   redis.exceptions.ResponseError: Error running script (call to
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_concurrency_limited_task_successfully_acquires_slot ___________ docket = async def test_concurrency_limited_task_successfully_acquires_slot(docket: Docket): """Tasks with concurrency limits successfully acquire slots when available""" executed: list[int] = [] async def limited_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=2, ), ) -> None: executed.append(customer_id) await asyncio.sleep(0.01) > await docket.add(limited_task)(customer_id=1) tests/concurrency_limits/test_execution_patterns.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_task_timeout_with_explicit_timeout ____________________ docket = async def test_task_timeout_with_explicit_timeout(docket: Docket): """Test that tasks with explicit Timeout are timed out correctly.""" task_started = False task_completed = False event = asyncio.Event() async def long_running_task( customer_id: int, test_mode: str = "timeout", concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), timeout: Timeout = Timeout(timedelta(seconds=1)), ): nonlocal task_started, task_completed task_started = True if test_mode == "complete": # Fast completion for coverage await asyncio.sleep(0.01) task_completed = True elif test_mode == "long_complete": # Long running but within timeout for coverage await 
asyncio.sleep(0.5) # Within the 1-second timeout task_completed = True else: # Simulate a task that would run longer than timeout # Don't set event - task will hang and be timed out await event.wait() docket.register(long_running_task) async with Worker( docket, minimum_check_interval=timedelta(milliseconds=50), scheduling_resolution=timedelta(milliseconds=50), redelivery_timeout=timedelta(seconds=3), ) as worker: # Schedule the long-running task > await docket.add(long_running_task)(customer_id=1) tests/concurrency_limits/test_redelivery.py:51: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = 
None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_task_timeout_with_concurrent_tasks ____________________ docket = async def test_task_timeout_with_concurrent_tasks(docket: Docket): """Test that concurrency control works with hard timeouts.""" tasks_started: list[int] = [] tasks_completed: list[int] = [] async def task_within_timeout( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=2 ), ): tasks_started.append(customer_id) # Task that completes within timeout await asyncio.sleep(1) tasks_completed.append(customer_id) # Create a worker with reasonable timeout async with Worker( docket, minimum_check_interval=timedelta(milliseconds=5), scheduling_resolution=timedelta(milliseconds=5), redelivery_timeout=timedelta(seconds=3), # Tasks will timeout after 3 seconds ) as worker: # Schedule multiple tasks for the same customer (will run concurrently up to limit) for _ in range(3): # 3 tasks, but max_concurrent=2 > await docket.add(task_within_timeout)(customer_id=1) 
tests/concurrency_limits/test_redelivery.py:104: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await 
self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_explicit_timeout_limits_long_tasks ____________________ docket = async def test_explicit_timeout_limits_long_tasks(docket: Docket): """Test that tasks with explicit Timeout longer than the limit are terminated.""" task_completed = False event = asyncio.Event() async def long_task( customer_id: int, test_mode: str = "timeout", concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), timeout: Timeout = Timeout(timedelta(seconds=1)), ): nonlocal task_completed if test_mode == "complete": # Fast completion for coverage await asyncio.sleep(0.01) task_completed = True elif test_mode == "long_complete": # Long running but completes within timeout await asyncio.sleep(0.5) # Less than 1 second timeout task_completed = True else: # Simulate a task that would run longer than timeout # Don't set event - task will hang and be timed out await event.wait() docket.register(long_task) async with Worker( docket, minimum_check_interval=timedelta(milliseconds=50), scheduling_resolution=timedelta(milliseconds=50), redelivery_timeout=timedelta(seconds=3), ) as worker: # Schedule long-running task > await docket.add(long_task)(customer_id=1) tests/concurrency_limits/test_redelivery.py:150: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule 
await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E 
stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_short_tasks_complete_within_timeout ___________________ docket = async def test_short_tasks_complete_within_timeout(docket: Docket): """Test that short tasks complete successfully within redelivery timeout.""" tasks_completed = 0 async def short_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): nonlocal tasks_completed await asyncio.sleep(0.1) # Very short task tasks_completed += 1 async with Worker( docket, minimum_check_interval=timedelta(milliseconds=50), scheduling_resolution=timedelta(milliseconds=50), redelivery_timeout=timedelta(seconds=3), ) as worker: # Schedule multiple short tasks for _ in range(5): > await docket.add(short_task)(customer_id=1) tests/concurrency_limits/test_redelivery.py:193: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_redeliveries_respect_concurrency_limits _________________ docket = async def test_redeliveries_respect_concurrency_limits(docket: Docket): """Test that redelivered tasks still respect concurrency limits""" task_executions: list[tuple[int, float, float]] = [] # (customer_id, start, end) failure_count = 0 async def task_that_sometimes_fails( customer_id: int, should_fail: bool, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1, ), ): nonlocal failure_count start = time.monotonic() await asyncio.sleep(0.02) end = time.monotonic() task_executions.append((customer_id, start, end)) if should_fail: 
failure_count += 1 raise ValueError("Intentional failure for testing") # Schedule tasks: some will fail initially, others succeed > await docket.add(task_that_sometimes_fails)(customer_id=1, should_fail=True) tests/concurrency_limits/test_redelivery.py:231: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_concurrency_blocked_task_executes_exactly_once ______________ docket = async def test_concurrency_blocked_task_executes_exactly_once(docket: Docket): """Concurrency limits should prevent tasks for the same customer from overlapping, while allowing parallelism across different customers. This test uses TWO separate workers to ensure concurrency limits work across workers, not just within a single worker. This is important because xautoclaim can reclaim messages from one worker and deliver them to another. 
""" executions: list[tuple[int, float, float, str]] = [] async def tracked_task( customer_id: int, execution: Execution = CurrentExecution(), concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1, ), ) -> None: start = time.monotonic() await asyncio.sleep(0.02) end = time.monotonic() executions.append((customer_id, start, end, execution.key)) # Schedule 5 tasks for each of 3 customers for customer_id in [1, 2, 3]: for _ in range(5): > await docket.add(tracked_task)(customer_id=customer_id) tests/concurrency_limits/test_redelivery.py:282: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main 
chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_worker_concurrency_missing_argument_fails_task ______________ docket = async def test_worker_concurrency_missing_argument_fails_task(docket: Docket): """Test that tasks with missing concurrency arguments fail with clear error""" task_executed = False async def task_missing_concurrency_arg( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "missing_param", max_concurrent=1, ), ): nonlocal task_executed task_executed = True # pragma: no cover > await docket.add(task_missing_concurrency_arg)(customer_id=1) tests/concurrency_limits/test_worker_mechanics.py:30: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, 
len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError 
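Every failure in this run is the same error surfacing through docket's scheduling path: the Lua script that `Execution.schedule` dispatches via `evalsha` calls the global `unpack`, which the Lua interpreter embedded in fakeredis reports as nil. A plausible explanation (an assumption, not confirmed by this log) is a Lua version mismatch: real Redis embeds Lua 5.1, where `unpack` is a global, while Lua 5.2 and later moved it to `table.unpack`, so a runtime built on newer Lua leaves the global undefined. A minimal sketch of a portability guard follows; the helper name `with_unpack_shim` and the registration example are hypothetical, not docket's actual API:

```python
# Hypothetical sketch: prepend a one-line Lua compatibility shim so a script
# that calls the global `unpack` keeps working on Lua 5.1 (real Redis) and on
# Lua 5.2+ runtimes where only `table.unpack` exists.
LUA_UNPACK_SHIM = "local unpack = unpack or table.unpack\n"


def with_unpack_shim(script: str) -> str:
    """Return `script` with the unpack compatibility shim prepended."""
    return LUA_UNPACK_SHIM + script


# Usage sketch (SCHEDULE_LUA is a placeholder for the script body; with
# redis-py one would register it via client.register_script(...)):
guarded = with_unpack_shim("redis.call('SET', KEYS[1], unpack(ARGV))")
```

Note that prepending shifts reported Lua line numbers by one, which matters when matching errors like the `[string ""]:121` position above against the script source.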
________________ test_worker_concurrency_no_limit_early_return _________________

docket = 

    async def test_worker_concurrency_no_limit_early_return(docket: Docket):
        """Test tasks without concurrency limits execute normally"""
        task_executed = False

        async def task_without_concurrency(customer_id: int):
            nonlocal task_executed
            task_executed = True

>       await docket.add(task_without_concurrency)(customer_id=1)

tests/concurrency_limits/test_worker_mechanics.py:47:
...
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________ test_worker_concurrency_missing_argument_shows_available_args _________

docket = 

    async def test_worker_concurrency_missing_argument_shows_available_args(docket: Docket):
        """Test that missing argument error shows available arguments for debugging."""
        task_executed = False

        async def task_missing_concurrency_arg(
            actual_param: int,
            concurrency: ConcurrencyLimit = ConcurrencyLimit(
                "missing_param", max_concurrent=1
            ),
        ):
            nonlocal task_executed
            task_executed = True  # pragma: no cover

>       await docket.add(task_missing_concurrency_arg)(actual_param=42)

tests/concurrency_limits/test_worker_mechanics.py:68:
...
E   redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E   stack traceback:
E   	[string ""]:121: in main chunk
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_worker_concurrency_cleanup_on_success __________________
docket = async def test_worker_concurrency_cleanup_on_success(docket: Docket): """Test that concurrency slots are released when tasks complete successfully""" completed_tasks: list[int] = [] async def successful_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): completed_tasks.append(customer_id) await asyncio.sleep(0.01) > await docket.add(successful_task)(customer_id=1) tests/concurrency_limits/test_worker_mechanics.py:90: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: 
Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_worker_concurrency_cleanup_on_failure __________________ docket = async def test_worker_concurrency_cleanup_on_failure(docket: Docket): """Test that concurrency slots are released when tasks fail""" execution_results: list[tuple[str, int, bool]] = [] async def task_that_may_fail( customer_id: int, should_fail: bool, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): execution_results.append(("executed", customer_id, should_fail)) await asyncio.sleep(0.01) if should_fail: raise ValueError("Intentional test failure") > await docket.add(task_that_may_fail)(customer_id=1, should_fail=True) tests/concurrency_limits/test_worker_mechanics.py:118: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return 
await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_worker_concurrency_cleanup_after_task_completion _____________ docket = async def test_worker_concurrency_cleanup_after_task_completion(docket: Docket): """Test that concurrency slots are properly cleaned up after task completion""" cleanup_verified = False async def task_with_cleanup_verification( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): await asyncio.sleep(0.01) > await docket.add(task_with_cleanup_verification)(customer_id=1) tests/concurrency_limits/test_worker_mechanics.py:144: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a 
nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_worker_handles_concurrent_task_cleanup_gracefully ____________ docket = async def test_worker_handles_concurrent_task_cleanup_gracefully(docket: Docket): """Test that worker handles task cleanup correctly under concurrent execution""" cleanup_success = True task_count = 0 async def cleanup_test_task( customer_id: int, should_fail: bool = False, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): nonlocal task_count, cleanup_success task_count += 1 try: await asyncio.sleep(0.01) if should_fail: raise ValueError("Test exception for coverage") except Exception: cleanup_success = False raise for _ in range(2): > await docket.add(cleanup_test_task)(customer_id=1, should_fail=False) tests/concurrency_limits/test_worker_mechanics.py:179: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in 
scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call 
to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_finally_block_releases_concurrency_on_success ______________ docket = async def test_finally_block_releases_concurrency_on_success(docket: Docket): """Test that concurrency slot is released when task completes successfully.""" task_completed = False async def successful_task( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ): nonlocal task_completed await asyncio.sleep(0.01) task_completed = True > await docket.add(successful_task)(customer_id=1) tests/concurrency_limits/test_worker_mechanics.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_stale_concurrency_slots_are_scavenged_when_full _____________ docket = async def test_stale_concurrency_slots_are_scavenged_when_full(docket: Docket): """Test that stale slots are scavenged on-demand when concurrency is full. Slots are only scavenged when a new task needs one and all slots are taken. This is a distributed approach - each worker cleans up as needed rather than proactive garbage collection. """ task_completed = False async def task_with_concurrency( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=2 ), ): nonlocal task_completed task_completed = True # Manually insert stale slots into the concurrency sorted set. 
# These simulate slots from workers that crashed without releasing. concurrency_key = f"{docket.name}:concurrency:customer_id:123" stale_timestamp = ( datetime.now(timezone.utc).timestamp() - 400 ) # >redelivery_timeout old async with docket.redis() as redis: # Add two stale slots that fill up max_concurrent await redis.zadd(concurrency_key, {"stale_task_1": stale_timestamp}) # type: ignore await redis.zadd(concurrency_key, {"stale_task_2": stale_timestamp}) # type: ignore # Verify stale slots are present count_before = await redis.zcard(concurrency_key) # type: ignore assert count_before == 2 # Run a task - this should scavenge ONE stale slot and execute > await docket.add(task_with_concurrency)(customer_id=123) tests/concurrency_limits/test_worker_mechanics.py:248: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = 
ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_graceful_shutdown_releases_concurrency_slots _______________ docket = async def test_graceful_shutdown_releases_concurrency_slots(docket: Docket): """Verify that concurrency slots are released when worker shuts down gracefully. When a worker receives a shutdown signal while tasks are running, it should drain the active tasks (let them complete) and release their concurrency slots. 
""" task_started = asyncio.Event() task_can_finish = asyncio.Event() task_completed = False async def slow_task_with_concurrency( customer_id: int, concurrency: ConcurrencyLimit = ConcurrencyLimit( "customer_id", max_concurrent=1 ), ) -> None: nonlocal task_completed task_started.set() await task_can_finish.wait() task_completed = True > await docket.add(slow_task_with_concurrency)(customer_id=42) tests/concurrency_limits/test_worker_mechanics.py:286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not 
self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_simple_function_dependencies _______________________ docket = worker = async def test_simple_function_dependencies(docket: Docket, worker: Worker): """A task can depend on the return value of simple functions""" async def dependency_one() -> str: return f"one-{uuid4()}" async def dependency_two() -> str: return f"two-{uuid4()}" called = 0 async def dependent_task( one_a: str = Depends(dependency_one), one_b: str = Depends(dependency_one), two: str = Depends(dependency_two), ): assert one_a.startswith("one-") assert one_b == one_a assert two.startswith("two-") nonlocal called called += 1 > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, 
len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError 
_________________________ test_contextual_dependencies _________________________ docket = worker = async def test_contextual_dependencies(docket: Docket, worker: Worker): """A task can depend on the return value of async context managers""" stages: list[str] = [] @asynccontextmanager async def dependency_one() -> AsyncGenerator[str, None]: stages.append("one-before") yield f"one-{uuid4()}" stages.append("one-after") async def dependency_two() -> str: return f"two-{uuid4()}" called = 0 async def dependent_task( one_a: str = Depends(dependency_one), one_b: str = Depends(dependency_one), two: str = Depends(dependency_two), ): assert one_a.startswith("one-") assert one_b == one_a assert two.startswith("two-") nonlocal called called += 1 > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:80: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_dependencies_of_dependencies _______________________ docket = worker = async def test_dependencies_of_dependencies(docket: Docket, worker: Worker): """A task dependency can depend on other dependencies""" counter = 0 async def dependency_one() -> list[str]: nonlocal counter counter += 1 return [f"one-{counter}"] async def dependency_two(my_one: list[str] = Depends(dependency_one)) -> list[str]: nonlocal counter counter += 1 return my_one + [f"two-{counter}"] async def dependency_three( my_one: list[str] = Depends(dependency_one), my_two: list[str] = Depends(dependency_two), ) -> list[str]: nonlocal counter counter += 1 return my_one + my_two + [f"three-{counter}"] async def dependent_task( one_a: list[str] = 
Depends(dependency_one), one_b: list[str] = Depends(dependency_one), two: list[str] = Depends(dependency_two), three: list[str] = Depends(dependency_three), ): assert one_a is one_b assert one_a == ["one-1"] assert two == ["one-1", "two-2"] assert three == ["one-1", "two-2", "three-3"] > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:122: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not 
self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_dependencies_can_ask_for_docket_dependencies _______________ docket = worker = async def test_dependencies_can_ask_for_docket_dependencies( docket: Docket, worker: Worker ): """A task dependency can ask for a docket dependency""" called = 0 async def dependency_one(this_docket: Docket = CurrentDocket()) -> str: assert this_docket is docket nonlocal called called += 1 return f"one-{called}" async def dependency_two( this_worker: Worker = CurrentWorker(), one: str = Depends(dependency_one), ) -> str: assert this_worker is worker assert one == "one-1" nonlocal called called += 1 return f"two-{called}" async def dependent_task( one: str = Depends(dependency_one), two: str = Depends(dependency_two), this_docket: Docket = CurrentDocket(), this_worker: Worker = CurrentWorker(), ): assert one == "one-1" assert two == "two-2" assert this_docket is docket assert this_worker is worker nonlocal called called += 1 > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:170: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await 
execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_dependency_failures_are_task_failures __________________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff964d6d60> async def test_dependency_failures_are_task_failures( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A task dependency failure will cause the task to fail""" called: bool = False async def dependency_one() -> str: raise ValueError("this one is bad") async def dependency_two() -> str: raise ValueError("and so is this one") async def dependent_task( a: str = Depends(dependency_one), b: str = Depends(dependency_two), ) -> None: nonlocal called called = True # pragma: no cover > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:195: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________ test_contextual_dependency_before_failures_are_task_failures _________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff964d74d0> async def test_contextual_dependency_before_failures_are_task_failures( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A contextual task dependency failure will cause the task to fail""" called: int = 0 @asynccontextmanager async def dependency_before() -> AsyncGenerator[str, None]: raise ValueError("this one is bad") yield "this 
won't be used" # pragma: no cover async def dependent_task( a: str = Depends(dependency_before), ) -> None: nonlocal called called += 1 # pragma: no cover > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:225: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________ test_contextual_dependency_after_failures_are_task_failures __________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff968255c0> async def test_contextual_dependency_after_failures_are_task_failures( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A contextual task dependency failure will cause the task to fail""" called: int = 0 @asynccontextmanager async def dependency_after() -> AsyncGenerator[str, None]: yield "this will be used" raise ValueError("this one is bad") async def dependent_task( a: str = Depends(dependency_after), ) -> None: assert a == "this will be used" nonlocal called called += 1 > await docket.add(dependent_task)() tests/fundamentals/test_async_dependencies.py:256: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in 
execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_dependencies_can_ask_for_task_arguments _________________ docket = worker = async def 
test_dependencies_can_ask_for_task_arguments(docket: Docket, worker: Worker): """A task dependency can ask for a task argument""" called = 0 async def dependency_one(a: list[str] = TaskArgument()) -> list[str]: return a async def dependency_two(another_name: list[str] = TaskArgument("a")) -> list[str]: return another_name async def dependent_task( a: list[str], b: list[str] = TaskArgument("a"), c: list[str] = Depends(dependency_one), d: list[str] = Depends(dependency_two), ) -> None: assert a is b assert a is c assert a is d nonlocal called called += 1 > await docket.add(dependent_task)(a=["hello", "world"]) tests/fundamentals/test_async_dependencies.py:290: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string 
""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_task_arguments_may_be_optional ______________________ docket = worker = async def test_task_arguments_may_be_optional(docket: Docket, worker: Worker): """A task dependency can ask for a task argument optionally""" called = 0 async def dependency_one( a: list[str] | None = TaskArgument(optional=True), ) -> list[str] | None: return a async def dependent_task( not_a: list[str], b: list[str] | None = Depends(dependency_one), ) -> None: assert not_a == ["hello", "world"] assert b is None nonlocal called called += 1 > await docket.add(dependent_task)(not_a=["hello", "world"]) tests/fundamentals/test_async_dependencies.py:317: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_all_dockets_have_a_trace_task ______________________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff96492cf0> async def test_all_dockets_have_a_trace_task( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """All dockets should have a trace task""" > await docket.add(tasks.trace)("Hello, world!") tests/fundamentals/test_builtin_tasks.py:15: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
\'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_all_dockets_have_a_fail_task _______________________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff967f46e0> async def test_all_dockets_have_a_fail_task( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """All dockets should have a fail task""" > await docket.add(tasks.fail)("Hello, world!") tests/fundamentals/test_builtin_tasks.py:28: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_cancelling_future_task __________________________ 
docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_cancelling_future_task( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """docket should allow for cancelling a task""" soon = now() + timedelta(milliseconds=100) > execution = await docket.add(the_task, soon)("a", "b", c="c") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_cancellation.py:17: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not 
self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_cancelling_immediate_task ________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_cancelling_immediate_task( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """docket can cancel a task that is scheduled immediately""" > execution = await docket.add(the_task, now())("a", "b", c="c") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_cancellation.py:33: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_cancellation_is_idempotent ________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_cancellation_is_idempotent( 
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """Test that canceling the same task twice doesn't error."""
        key = f"test-task:{uuid4()}"

        # Schedule a task
        later = now() + timedelta(seconds=1)
>       await docket.add(the_task, later, key=key)("test")

tests/fundamentals/test_cancellation.py:52: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_supports_requesting_current_docket ____________________

docket = 
worker = 
now = functools.partial(, datetime.timezone.utc)

    async def test_supports_requesting_current_docket(
        docket: Docket, worker: Worker, now: Callable[[], datetime]
    ):
        """docket should support providing the current docket to a task"""
        called = False

        async def the_task(a: str, b: str, this_docket: Docket = CurrentDocket()):
            assert a == "a"
            assert b == "c"
            assert this_docket is docket
            nonlocal called
            called = True

>       await docket.add(the_task)("a", b="c")

tests/fundamentals/test_context_injection.py:32: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_supports_requesting_current_worker ____________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_requesting_current_worker( 
docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support providing the current worker to a task""" called = False async def the_task(a: str, b: str, this_worker: Worker = CurrentWorker()): assert a == "a" assert b == "c" assert this_worker is worker nonlocal called called = True > await docket.add(the_task)("a", b="c") tests/fundamentals/test_context_injection.py:54: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_supports_requesting_current_execution __________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_requesting_current_execution( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support providing the current execution to a task""" called = False async def the_task(a: str, b: str, this_execution: Execution = CurrentExecution()): assert a == "a" assert b == "c" assert isinstance(this_execution, Execution) assert this_execution.key == "my-cool-task:123" nonlocal called called = True > await docket.add(the_task, key="my-cool-task:123")("a", b="c") tests/fundamentals/test_context_injection.py:78: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ 
test_supports_requesting_current_task_key ___________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_requesting_current_task_key( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support providing the current task key to a task""" called = False async def the_task(a: str, b: str, this_key: str = TaskKey()): assert a == "a" assert b == "c" assert this_key == "my-cool-task:123" nonlocal called called = True > await docket.add(the_task, key="my-cool-task:123")("a", b="c") tests/fundamentals/test_context_injection.py:100: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack 
traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_cron_task_reschedules_itself _______________________ docket = worker = async def test_cron_task_reschedules_itself(docket: Docket, worker: Worker): """Cron tasks automatically reschedule after each execution.""" runs = 0 async def my_cron_task(cron: Cron = Cron("0 9 * * *", automatic=False)): nonlocal runs runs += 1 # Patch croniter.get_next to return a time 10ms in the future with patch.object( croniter, "get_next", return_value=datetime.now(timezone.utc) + timedelta(milliseconds=10), ): > execution = await docket.add(my_cron_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_cron.py:30: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_cron_tasks_are_automatically_scheduled __________________

  + Exception Group Traceback (most recent call last):
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/fundamentals/test_cron.py", line 53, in test_cron_tasks_are_automatically_scheduled
  |     await worker.run_at_most({"my_automatic_cron": 2})
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 402, in run_at_most
  |     await self.run_until_finished()
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished
  |     return await self._run(forever=False)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run
  |     return await self._worker_loop(redis, forever=forever)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop
  |     async with TaskGroup() as infra:
  |                ~~~~~~~~~^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__
  |     return await self._aexit(et, exc)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit
  |     raise BaseExceptionGroup(
  |     ...<2 lines>...
  |     ) from None
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 557, in _worker_loop
    |     await self._schedule_all_automatic_perpetual_tasks()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 775, in _schedule_all_automatic_perpetual_tasks
    |     await self.docket.add(
    |         task_function, when=perpetual.initial_when, key=key
    |     )()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py", line 372, in scheduler
    |     await execution.schedule(replace=False)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 485, in schedule
    |     await schedule_script(
    |     ...<21 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    |     return await client.evalsha(self.sha, len(keys), *args)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    |     return await conn.retry.call_with_retry(
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |     ...<4 lines>...
    |     )
    |     ^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    |     return await do()
    |            ^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    |     return await self.parse_response(conn, command_name, **options)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    |     response = await connection.read_response()
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    |     raise response
    | redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
    | stack traceback:
    | 	[string ""]:135: in main chunk
    +------------------------------------
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* my_automatic_cron(...)
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * my_automatic_cron(...)
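Editor's note: every failure in this log reduces to the same root cause. The Lua script that docket's `schedule` runs via `evalsha` calls the global `unpack`, which exists in Lua 5.1 (the interpreter embedded in real Redis) but was moved to `table.unpack` in Lua 5.2 and later, which is apparently the Lua version behind fakeredis's lupa-based script engine in this buildroot. The usual workaround is a one-line compatibility shim at the top of the script. A minimal sketch in Python, with a hypothetical helper name (`make_script_portable` is not part of docket or fakeredis):

```python
# Lua 5.1 (real Redis) has a global `unpack`; Lua 5.2+ (as fakeredis/lupa
# may embed) only has `table.unpack`. Prepending this shim lets a script
# written against Redis's Lua 5.1 run unmodified on either runtime.
LUA_UNPACK_SHIM = "local unpack = unpack or table.unpack\n"


def make_script_portable(script: str) -> str:
    """Hypothetical helper: prepend a shim so bare unpack(...) calls
    resolve on both Lua 5.1 and Lua 5.2+."""
    return LUA_UNPACK_SHIM + script


# Example: a script (like docket's scheduler) that calls unpack() directly.
schedule_lua = "redis.call('ZADD', KEYS[1], unpack(ARGV))"
portable = make_script_portable(schedule_lua)
```

Applying the shim upstream (or pinning a fakeredis/lupa build against Lua 5.1) would make these tests pass without touching the test suite itself.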
____________________ test_cron_tasks_continue_after_errors _____________________ docket = worker = async def test_cron_tasks_continue_after_errors(docket: Docket, worker: Worker): """Cron tasks keep rescheduling even when they raise exceptions.""" calls = 0 async def flaky_cron_task(cron: Cron = Cron("0 * * * *", automatic=False)): nonlocal calls calls += 1 raise ValueError("Task failed!") with patch.object( croniter, "get_next", return_value=datetime.now(timezone.utc) + timedelta(milliseconds=10), ): > execution = await docket.add(flaky_cron_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_cron.py:72: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
\'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_cron_tasks_can_cancel_themselves _____________________ docket = worker = async def test_cron_tasks_can_cancel_themselves(docket: Docket, worker: Worker): """A cron task can stop rescheduling by calling cron.cancel().""" calls = 0 async def limited_cron_task(cron: Cron = Cron("0 * * * *", automatic=False)): nonlocal calls calls += 1 if calls >= 3: cron.cancel() with patch.object( croniter, "get_next", return_value=datetime.now(timezone.utc) + timedelta(milliseconds=10), ): > await docket.add(limited_cron_task)() tests/fundamentals/test_cron.py:93: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ 
return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_cron_supports_vixie_keywords _______________________

docket = 
worker = 

    async def test_cron_supports_vixie_keywords(docket: Docket, worker: Worker):
        """Cron supports Vixie cron keywords like @daily, @weekly, @hourly."""
        runs = 0

        # @daily is equivalent to "0 0 * * *" (midnight every day)
        async def daily_task(cron: Cron = Cron("@daily", automatic=False)):
            nonlocal runs
            runs += 1

        with patch.object(
            croniter,
            "get_next",
            return_value=datetime.now(timezone.utc) + timedelta(milliseconds=10),
        ):
>           execution = await docket.add(daily_task)()
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/fundamentals/test_cron.py:113:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_automatic_cron_waits_for_scheduled_time _________________

  + Exception Group Traceback (most recent call last):
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/fundamentals/test_cron.py", line 136, in test_automatic_cron_waits_for_scheduled_time
  |     await worker.run_at_most({"scheduled_task": 1})
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 402, in run_at_most
  |     await self.run_until_finished()
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished
  |     return await self._run(forever=False)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run
  |     return await self._worker_loop(redis, forever=forever)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop
  |     async with TaskGroup() as infra:
  |                ~~~~~~~~~^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__
  |     return await self._aexit(et, exc)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit
  |     raise BaseExceptionGroup(
  |     ...<2 lines>...
  |     ) from None
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 557, in _worker_loop
    |     await self._schedule_all_automatic_perpetual_tasks()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 775, in _schedule_all_automatic_perpetual_tasks
    |     await self.docket.add(
    |         task_function, when=perpetual.initial_when, key=key
    |     )()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py", line 372, in scheduler
    |     await execution.schedule(replace=False)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 485, in schedule
    |     await schedule_script(
    |     ...<21 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    |     return await client.evalsha(self.sha, len(keys), *args)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    |     return await conn.retry.call_with_retry(
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |     ...<4 lines>...
    |     )
    |     ^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    |     return await do()
    |            ^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    |     return await self.parse_response(conn, command_name, **options)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    |     response = await connection.read_response()
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    |     raise response
    | redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
    | stack traceback:
    | 	[string ""]:135: in main chunk
    +------------------------------------
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* scheduled_task(...)
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * scheduled_task(...)
___________________________ test_cron_with_timezone ____________________________

docket = 
worker = 

    @pytest.mark.skipif(
        sys.platform == "win32", reason="Timing-sensitive: unreliable on Windows"
    )
    async def test_cron_with_timezone(docket: Docket, worker: Worker):
        """Cron tasks can be scheduled in a specific timezone."""
        runs = 0
        pacific = ZoneInfo("America/Los_Angeles")

        async def pacific_task(cron: Cron = Cron("0 9 * * *", tz=pacific, automatic=False)):
            nonlocal runs
            runs += 1

        with patch.object(
            croniter,
            "get_next",
            return_value=datetime.now(pacific) + timedelta(milliseconds=10),
        ):
>           execution = await docket.add(pacific_task)()
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/fundamentals/test_cron.py:161:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_adding_task_with_unbindable_arguments __________________

docket = 
worker = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff96bba350>

    async def test_adding_task_with_unbindable_arguments(
        docket: Docket,
        worker: Worker,
        caplog: pytest.LogCaptureFixture,
    ):
        """Should not raise an error when a task is scheduled or executed
        with incorrect arguments."""

        async def task_with_specific_args(a: str, b: int, c: bool = False) -> None:
            pass  # pragma: no cover

>       await docket.add(task_with_specific_args)("a", 2, d="unexpected")  # type: ignore[arg-type]
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/fundamentals/test_errors.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________________ test_adding_is_idempotent ___________________________

docket = 
worker = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_adding_is_idempotent(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """Adding a task with the same key twice should only run the first one."""
        key = f"my-cool-task:{uuid4()}"
        soon = now() + timedelta(milliseconds=10)

>       await docket.add(the_task, soon, key=key)("a", "b", c="c")

tests/fundamentals/test_idempotency.py:19:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_task_keys_are_idempotent_in_the_future __________________

docket = 
worker = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_task_keys_are_idempotent_in_the_future(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """A future task blocks an immediate task with the same key."""
        key = f"my-cool-task:{uuid4()}"
        soon = now() + timedelta(milliseconds=10)

>       await docket.add(the_task, when=soon, key=key)("a", "b", c="c")

tests/fundamentals/test_idempotency.py:41:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________ test_task_keys_are_idempotent_between_the_future_and_present _________

docket = 
worker = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_task_keys_are_idempotent_between_the_future_and_present(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """An immediate task blocks a future task with the same key."""
        key = f"my-cool-task:{uuid4()}"
        soon = now() + timedelta(milliseconds=10)

>       await docket.add(the_task, when=now(), key=key)("a", "b", c="c")

tests/fundamentals/test_idempotency.py:65:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_task_keys_are_idempotent_in_the_present _________________

docket = 
worker = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_task_keys_are_idempotent_in_the_present(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """Two immediate tasks with the same key only runs the first one."""
        key = f"my-cool-task:{uuid4()}"

>       await docket.add(the_task, when=now(), key=key)("a", "b", c="c")

tests/fundamentals/test_idempotency.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_tasks_can_opt_into_argument_logging ___________________

docket = 
worker = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff968ed010>

    async def test_tasks_can_opt_into_argument_logging(
        docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture
    ):
        """Tasks can opt into argument logging for specific arguments"""

        async def the_task(
            a: Annotated[str, Logged],
            b: str,
            c: Annotated[str, Logged()] = "c",
            d: Annotated[str, "nah chief"] = "d",
            docket: Docket = CurrentDocket(),
        ):
            pass

>       await docket.add(the_task)("value-a", b="value-b", c="value-c", d="value-d")

tests/fundamentals/test_logging.py:26:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_tasks_can_opt_into_logging_collection_lengths ______________

docket = 
worker = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff96546e40>

    async def test_tasks_can_opt_into_logging_collection_lengths(
        docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture
    ):
        """Tasks can opt into logging the length of collections"""

        async def the_task(
            a: Annotated[list[str], Logged(length_only=True)],
            b: Annotated[dict[str, str], Logged(length_only=True)],
            c: Annotated[tuple[str, ...], Logged(length_only=True)],
            d: Annotated[set[str], Logged(length_only=True)],
            e: Annotated[int, Logged(length_only=True)],
            docket: Docket = CurrentDocket(),
        ):
            pass

>       await docket.add(the_task)(
            ["a", "b"], b={"d": "e", "f": "g"}, c=("h", "i"), d={"a", "b", "c"}, e=123
        )

tests/fundamentals/test_logging.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________________ test_logging_inside_of_task __________________________

docket = 
worker = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff965469e0>

    async def test_logging_inside_of_task(
        docket: Docket,
        worker: Worker,
        caplog: pytest.LogCaptureFixture,
    ):
        """docket should support providing a logger with task context"""
        called = False

        async def the_task(
            a: str, b: str, logger: "LoggerAdapter[logging.Logger]" = TaskLogger()
        ):
            assert a == "a"
            assert b == "c"
            logger.info("Task is running")
            nonlocal called
            called = True

>       await docket.add(the_task, key="my-cool-task:123")("a", b="c")

tests/fundamentals/test_logging.py:86:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________________ test_perpetual_tasks _____________________________

docket = 
worker = 

    async def test_perpetual_tasks(docket: Docket, worker: Worker):
        """Perpetual tasks should reschedule themselves forever"""
        calls = 0

        async def perpetual_task(
            a: str,
            b: int,
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)),
        ):
            assert a == "a"
            assert b == 2
            assert isinstance(perpetual, Perpetual)
            assert perpetual.every == timedelta(milliseconds=50)
            nonlocal calls
            calls += 1

>       execution = await docket.add(perpetual_task)(a="a", b=2)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/fundamentals/test_perpetual.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_perpetual_tasks_can_cancel_themselves __________________

docket = 
worker = 

    async def test_perpetual_tasks_can_cancel_themselves(docket: Docket, worker: Worker):
        """A perpetual task can request its own cancellation"""
        calls = 0

        async def perpetual_task(
            a: str,
            b: int,
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)),
        ):
            assert a == "a"
            assert b == 2
            assert isinstance(perpetual, Perpetual)
            assert perpetual.every == timedelta(milliseconds=50)
            nonlocal calls
            calls += 1
            if calls == 3:
                perpetual.cancel()

>       await docket.add(perpetual_task)(a="a", b=2)

tests/fundamentals/test_perpetual.py:57:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_perpetual_tasks_can_change_their_parameters _______________

docket = 
worker = 

    async def test_perpetual_tasks_can_change_their_parameters(
        docket: Docket, worker: Worker
    ):
        """Perpetual tasks may change their parameters each time"""
        arguments:
list[tuple[str, int]] = [] async def perpetual_task( a: str, b: int, perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)), ): arguments.append((a, b)) perpetual.perpetuate(a + "a", b=b + 1) > execution = await docket.add(perpetual_task)(a="a", b=1) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_perpetual.py:78: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: 
try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_perpetual_tasks_perpetuate_even_after_errors _______________ docket = worker = async def test_perpetual_tasks_perpetuate_even_after_errors( docket: Docket, worker: Worker ): """Perpetual tasks keep rescheduling even when they raise exceptions.""" calls = 0 async def perpetual_task( a: str, b: int, perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)), ): nonlocal calls calls += 1 raise ValueError("woops!") > execution = await docket.add(perpetual_task)(a="a", b=1) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_perpetual.py:102: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_perpetual_tasks_can_be_automatically_scheduled ______________ + Exception Group Traceback (most recent call last): | File 
"/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/fundamentals/test_perpetual.py", line 132, in test_perpetual_tasks_can_be_automatically_scheduled | await worker.run_at_most({"my_automatic_task": 3}) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 402, in run_at_most | await self.run_until_finished() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished | return await self._run(forever=False) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run | return await self._worker_loop(redis, forever=forever) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop | async with TaskGroup() as infra: | ~~~~~~~~~^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__ | return await self._aexit(et, exc) | ^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit | raise BaseExceptionGroup( | ...<2 lines>... 
| ) from None | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 557, in _worker_loop | await self._schedule_all_automatic_perpetual_tasks() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 775, in _schedule_all_automatic_perpetual_tasks | await self.docket.add( | task_function, when=perpetual.initial_when, key=key | )() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py", line 372, in scheduler | await execution.schedule(replace=False) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 485, in schedule | await schedule_script( | ...<21 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... 
|     )
|     ^
|   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
|     return await do()
|            ^^^^^^^^^^
|   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
|     return await self.parse_response(conn, command_name, **options)
|            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
|     response = await connection.read_response()
|                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
|     raise response
| redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
| stack traceback:
| 	[string ""]:121: in main chunk
+------------------------------------
----------------------------- Captured stderr call -----------------------------
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* my_automatic_task(...)
------------------------------ Captured log call -------------------------------
INFO     docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO     docket.worker:worker.py:1007 * trace(message: str, ...)
INFO     docket.worker:worker.py:1007 * fail(message: str, ...)
INFO     docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO     docket.worker:worker.py:1007 * my_automatic_task(...)
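[editor's note] Every failure above shares one root cause: fakeredis runs server-side Lua scripts through an embedded Lua interpreter (lupa), and Lua 5.2+ removed the global `unpack` in favor of `table.unpack`, while real Redis executes scripts under Lua 5.1 where the global still exists, so docket's schedule script only breaks under fakeredis. A minimal sketch of the usual portability fix, assuming the script source can be patched before it is loaded; the helper name `make_portable` and the detection heuristic are illustrative, not pydocket's actual code:

```python
# Lua 5.1 (real Redis) still has a global `unpack`; Lua 5.2+ (the
# interpreter lupa/fakeredis may embed) only has `table.unpack`.
# Prepending this shim makes a script run on both.
LUA_UNPACK_SHIM = "local unpack = unpack or table.unpack\n"


def make_portable(lua_script: str) -> str:
    """Prepend a compatibility shim when the script calls the bare
    global `unpack` and is not already using `table.unpack`."""
    if "unpack(" in lua_script and "table.unpack" not in lua_script:
        return LUA_UNPACK_SHIM + lua_script
    return lua_script


# Hypothetical example script in the style of the failing schedule script:
schedule_script = "return {unpack(KEYS)}"
print(make_portable(schedule_script))
# local unpack = unpack or table.unpack
# return {unpack(KEYS)}
```

On Lua 5.1 the shim binds the existing global; on 5.2+ it falls back to `table.unpack`, so the same script source works against both real Redis and fakeredis.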
____________ test_perpetual_tasks_can_schedule_next_run_after_delay ____________ docket = worker = async def test_perpetual_tasks_can_schedule_next_run_after_delay( docket: Docket, worker: Worker ): """Perpetual.after() lets tasks control when the next run happens.""" run_times: list[datetime] = [] async def perpetual_task( perpetual: Perpetual = Perpetual(), ): run_times.append(datetime.now(timezone.utc)) perpetual.after(timedelta(milliseconds=100)) > execution = await docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_perpetual.py:149: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string 
""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_cancelled_automatic_perpetual_can_be_rescheduled _____________ docket = worker = async def test_cancelled_automatic_perpetual_can_be_rescheduled( docket: Docket, worker: Worker ): """After cancelling a scheduled Perpetual task, add() can reschedule it.""" calls = 0 async def my_auto_task( perpetual: Perpetual = Perpetual( every=timedelta(milliseconds=50), automatic=True ), ): nonlocal calls calls += 1 # Schedule the task but don't run a worker yet — leave it sitting in the queue > await docket.add(my_auto_task, key="my_auto_task")() tests/fundamentals/test_perpetual.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await 
client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: 
ResponseError _________ test_perpetual_tasks_can_schedule_next_run_at_specific_time __________ docket = worker = async def test_perpetual_tasks_can_schedule_next_run_at_specific_time( docket: Docket, worker: Worker ): """Perpetual.at() lets tasks schedule the next run at an absolute time.""" run_times: list[datetime] = [] async def perpetual_task( perpetual: Perpetual = Perpetual(), ): run_times.append(datetime.now(timezone.utc)) perpetual.at(datetime.now(timezone.utc) + timedelta(milliseconds=100)) > execution = await docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_perpetual.py:200: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value 
(global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_tasks_can_report_progress ________________________ docket = worker = async def test_tasks_can_report_progress(docket: Docket, worker: Worker): """docket should support tasks reporting their progress""" called = False async def the_task( a: str, b: str, progress: Progress = Progress(), ): assert a == "a" assert b == "c" # Set the total expected work await progress.set_total(100) # Increment progress await progress.increment(10) await progress.increment(20) # Set a status message await progress.set_message("Processing items...") # Read back current progress assert progress.current == 30 assert progress.total == 100 assert progress.message == "Processing items..." 
nonlocal called called = True > await docket.add(the_task, key="progress-task:123")("a", b="c") tests/fundamentals/test_progress_state.py:49: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) 
else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_tasks_can_access_execution_state _____________________ docket = worker = async def test_tasks_can_access_execution_state(docket: Docket, worker: Worker): """docket should support providing execution state and metadata to a task""" called = False async def the_task( a: str, b: str, this_execution: Execution = CurrentExecution(), ): assert a == "a" assert b == "c" assert isinstance(this_execution, Execution) assert this_execution.key == "stateful-task:123" assert this_execution.state == ExecutionState.RUNNING assert this_execution.worker is not None assert this_execution.started_at is not None nonlocal called called = True > await docket.add(the_task, key="stateful-task:123")("a", b="c") tests/fundamentals/test_progress_state.py:78: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in 
call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_execution_state_lifecycle ________________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_execution_state_lifecycle( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket 
executions transition through states: QUEUED → RUNNING → COMPLETED""" async def successful_task(): await asyncio.sleep(0.01) async def failing_task(): await asyncio.sleep(0.01) raise ValueError("Task failed") # Test successful execution lifecycle > execution = await docket.add( successful_task, key="success:123", when=now() + timedelta(seconds=1) )() tests/fundamentals/test_progress_state.py:98: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_task_results_can_be_stored_and_retrieved _________________ docket = worker = async def test_task_results_can_be_stored_and_retrieved(docket: Docket, worker: Worker): """Test that string results are stored and retrievable.""" result_value = "hello world" async def returns_str() -> str: return result_value docket.register(returns_str) > execution = await docket.add(returns_str)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_results.py:14: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in 
call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________________ test_errors_are_logged ____________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff96972200> async def test_errors_are_logged( 
docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime], caplog: pytest.LogCaptureFixture, ): """docket should log errors when a task fails""" the_task.side_effect = Exception("Faily McFailerson") > await docket.add(the_task, now())("a", "b", c="c") tests/fundamentals/test_retries.py:22: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_supports_simple_linear_retries ______________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_simple_linear_retries( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support simple linear retries""" calls = 0 async def the_task( a: str, b: str = "b", retry: Retry = Retry(attempts=3), ) -> None: assert a == "a" assert b == "c" assert retry is not None nonlocal calls calls += 1 assert retry.attempts == 3 assert retry.attempt == calls raise Exception("Failed") > await docket.add(the_task)("a", b="c") tests/fundamentals/test_retries.py:56: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await 
conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_supports_simple_linear_retries_with_delay ________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def 
test_supports_simple_linear_retries_with_delay( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support simple linear retries with a delay""" calls = 0 async def the_task( a: str, b: str = "b", retry: Retry = Retry(attempts=3, delay=timedelta(milliseconds=100)), ) -> None: assert a == "a" assert b == "c" assert retry is not None nonlocal calls calls += 1 assert retry.attempts == 3 assert retry.attempt == calls raise Exception("Failed") > await docket.add(the_task)("a", b="c") tests/fundamentals/test_retries.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = 
None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_supports_infinite_retries ________________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_infinite_retries( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support infinite retries (None for attempts)""" calls = 0 async def the_task( a: str, b: str = "b", retry: Retry = Retry(attempts=None), ) -> None: assert a == "a" assert b == "c" assert retry is not None assert retry.attempts is None nonlocal calls calls += 1 assert retry.attempt == calls if calls < 3: raise Exception("Failed") > await docket.add(the_task)("a", b="c") tests/fundamentals/test_retries.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_supports_exponential_backoff_retries ___________________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_exponential_backoff_retries( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """docket should support exponential backoff retries""" calls = 0 async def the_task( a: str, b: str = "b", retry: Retry = ExponentialRetry( attempts=5, minimum_delay=timedelta(milliseconds=25), maximum_delay=timedelta(milliseconds=1000), ), ) -> None: assert a == "a" assert b == "c" assert isinstance(retry, ExponentialRetry) nonlocal calls calls += 1 assert retry.attempts == 5 assert retry.attempt == calls raise Exception("Failed") > await docket.add(the_task)("a", b="c") tests/fundamentals/test_retries.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________ test_supports_exponential_backoff_retries_under_maximum_delay _________ docket = worker = now = functools.partial(, datetime.timezone.utc) async def test_supports_exponential_backoff_retries_under_maximum_delay( docket: Docket, worker: Worker, now: Callable[[], datetime] ): """Exponential backoff should cap delays at the configured maximum.""" calls = 0 async def the_task( a: str, b: str = "b", retry: Retry = ExponentialRetry( attempts=5, minimum_delay=timedelta(milliseconds=25), maximum_delay=timedelta(milliseconds=100), ), ) -> None: assert a == "a" assert b == "c" assert isinstance(retry, ExponentialRetry) nonlocal calls calls += 1 assert retry.attempts == 5 assert retry.attempt == calls 
raise Exception("Failed") > await docket.add(the_task)("a", b="c") tests/fundamentals/test_retries.py:203: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = 
kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_immediate_task_execution _________________________ docket = worker = the_task = async def test_immediate_task_execution( docket: Docket, worker: Worker, the_task: AsyncMock ): """docket should execute a task immediately.""" > await docket.add(the_task)("a", "b", c="c") tests/fundamentals/test_scheduling.py:16: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_immediate_task_execution_by_name _____________________ docket = worker = the_task = async def test_immediate_task_execution_by_name( docket: Docket, worker: Worker, the_task: AsyncMock ): """docket should execute a task immediately by name.""" docket.register(the_task) > await docket.add("the_task")("a", "b", c="c") tests/fundamentals/test_scheduling.py:30: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_scheduled_execution ___________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_scheduled_execution( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """docket should execute a task at a specific time.""" when = now() + timedelta(milliseconds=100) > await docket.add(the_task, when)("a", "b", c="c") tests/fundamentals/test_scheduling.py:43: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main 
chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_rescheduling_later ____________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_rescheduling_later( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """docket should allow for rescheduling a task for later""" key = f"my-cool-task:{uuid4()}" soon = now() + timedelta(milliseconds=10) > await docket.add(the_task, soon, key=key)("a", "b", c="c") tests/fundamentals/test_scheduling.py:60: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________________ test_rescheduling_earlier ___________________________ 
docket = worker = the_task = now = functools.partial(, datetime.timezone.utc)

    async def test_rescheduling_earlier(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """docket should allow for rescheduling a task for earlier"""
        key = f"my-cool-task:{uuid4()}"
        soon = now() + timedelta(milliseconds=100)
>       await docket.add(the_task, soon, key)("a", "b", c="c")

tests/fundamentals/test_scheduling.py:82:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________________ test_rescheduling_by_name ___________________________
docket = worker = the_task = now = functools.partial(, datetime.timezone.utc)

    async def test_rescheduling_by_name(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """docket should allow for rescheduling a task for later"""
        key = f"my-cool-task:{uuid4()}"
        soon = now() + timedelta(milliseconds=100)
>       await docket.add(the_task, soon, key=key)("a", "b", c="c")

tests/fundamentals/test_scheduling.py:104:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_replace_without_existing_task_acts_like_add _______________
docket = worker = the_task = now = functools.partial(, datetime.timezone.utc)

    async def test_replace_without_existing_task_acts_like_add(
        docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """docket.replace() on a non-existent key should schedule the task like add()"""
        key = f"my-cool-task:{uuid4()}"
        # Replace without prior add - should just schedule the task
        later = now() + timedelta(milliseconds=100)
>       await docket.replace(the_task, later, key=key)("b", "c", c="d")

tests/fundamentals/test_scheduling.py:125:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:469: in scheduler
    await execution.schedule(replace=True)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_self_perpetuating_immediate_tasks ____________________
docket = worker = now = functools.partial(, datetime.timezone.utc)

    async def test_self_perpetuating_immediate_tasks(
        docket: Docket, worker: Worker, now: Callable[[], datetime]
    ):
        """docket should support self-perpetuating tasks"""
        calls: dict[str, list[int]] = {
            "first": [],
            "second": [],
        }

        async def the_task(start: int, iteration: int, key: str = TaskKey()):
            calls[key].append(start + iteration)
            if iteration < 3:
                # Use replace() for self-perpetuating to allow rescheduling while running
                await docket.replace(the_task, now(), key=key)(start, iteration + 1)

>       await docket.add(the_task, key="first")(10, 1)

tests/fundamentals/test_self_perpetuation.py:25:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_self_perpetuating_scheduled_tasks ____________________
docket = worker = now = functools.partial(, datetime.timezone.utc)

    async def test_self_perpetuating_scheduled_tasks(
        docket: Docket, worker: Worker, now: Callable[[], datetime]
    ):
        """docket should support self-perpetuating tasks"""
        calls: dict[str, list[int]] = {
            "first": [],
            "second": [],
        }

        async def the_task(start: int, iteration: int, key: str = TaskKey()):
            calls[key].append(start + iteration)
            if iteration < 3:
                soon = now() + timedelta(milliseconds=100)
                # Use replace() for self-perpetuating to allow rescheduling while running
                await docket.replace(the_task, key=key, when=soon)(start, iteration + 1)

>       await docket.add(the_task, key="first")(10, 1)

tests/fundamentals/test_self_perpetuation.py:51:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_infinitely_self_perpetuating_tasks ____________________
docket = worker = now = functools.partial(, datetime.timezone.utc)

    async def test_infinitely_self_perpetuating_tasks(
        docket: Docket, worker: Worker, now: Callable[[], datetime]
    ):
        """docket should support testing use cases for infinitely self-perpetuating tasks"""
        calls: dict[str, list[int]] = {
            "first": [],
            "second": [],
            "unaffected": [],
        }

        async def the_task(start: int, iteration: int, key: str = TaskKey()):
            calls[key].append(start + iteration)
            soon = now() + timedelta(milliseconds=100)
            # Use replace() for self-perpetuating to allow rescheduling while running
            await docket.replace(the_task, key=key, when=soon)(start, iteration + 1)

        async def unaffected_task(start: int, iteration: int, key: str = TaskKey()):
            calls[key].append(start + iteration)
            if iteration < 3:
                # Use replace() for self-perpetuating to allow rescheduling while running
                await docket.replace(unaffected_task, now(), key=key)(start, iteration + 1)

>       await docket.add(the_task, key="first")(10, 1)

tests/fundamentals/test_self_perpetuation.py:83:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_shared_dependency_is_initialized_once __________________
docket = worker =

    async def test_shared_dependency_is_initialized_once(docket: Docket, worker: Worker):
        """A Shared dependency initializes once at worker startup, not per-task."""
        init_count = 0

        @asynccontextmanager
        async def create_resource() -> AsyncGenerator[str, None]:
            nonlocal init_count
            init_count += 1
            yield f"resource-{init_count}"

        results: list[str] = []

        async def task_using_shared(r: str = Shared(create_resource)):
            results.append(r)

        docket.register(task_using_shared)
>       await docket.add(task_using_shared)()

tests/fundamentals/test_shared_dependencies.py:32:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_shared_dependencies_are_same_instance __________________
docket = worker =

    async def test_shared_dependencies_are_same_instance(docket: Docket, worker: Worker):
        """Multiple tasks receive the exact same object instance."""

        @asynccontextmanager
        async def create_resource() -> AsyncGenerator[object, None]:
            yield object()

        instances: list[object] = []

        async def capture_instance(r: object = Shared(create_resource)):
            instances.append(r)

        docket.register(capture_instance)
>       await docket.add(capture_instance)()

tests/fundamentals/test_shared_dependencies.py:54:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_shared_identity_is_factory_function ___________________
docket = worker =

    async def test_shared_identity_is_factory_function(docket: Docket, worker: Worker):
        """Multiple Shared() calls with the same factory resolve to the same value."""
        init_count = 0

        @asynccontextmanager
        async def create_resource() -> AsyncGenerator[str, None]:
            nonlocal init_count
            init_count += 1
            yield f"resource-{init_count}"

        results: list[tuple[str, str]] = []

        async def task_a(r: str = Shared(create_resource)):
            results.append(("a", r))

        async def task_b(r: str = Shared(create_resource)):
            results.append(("b", r))

        docket.register(task_a)
        docket.register(task_b)
>       await docket.add(task_a)()

tests/fundamentals/test_shared_dependencies.py:83:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_shared_cleanup_on_worker_exit ______________________
docket =

    async def test_shared_cleanup_on_worker_exit(docket: Docket):
        """Shared resources are cleaned up when the worker exits."""
        stages: list[str] = []

        @asynccontextmanager
        async def create_resource() -> AsyncGenerator[str, None]:
            stages.append("startup")
            yield "resource"
            stages.append("shutdown")

        async def task_using_shared(r: str = Shared(create_resource)):
            stages.append("task-ran")

        async with Worker(
            docket, minimum_check_interval=timedelta(milliseconds=5)
        ) as worker:
            docket.register(task_using_shared)
>           await docket.add(task_using_shared)()

tests/fundamentals/test_shared_dependencies.py:108:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_shared_depending_on_shared ________________________
docket = worker =

    async def test_shared_depending_on_shared(docket: Docket, worker: Worker):
        """A Shared dependency can depend on another Shared dependency."""

        @asynccontextmanager
        async def create_config() -> AsyncGenerator[dict[str, str], None]:
            yield {"db_url": "postgres://localhost/test"}

        @asynccontextmanager
        async def create_pool(
            cfg: dict[str, str] = Shared(create_config),
        ) -> AsyncGenerator[str, None]:
            yield f"pool({cfg['db_url']})"

        results: list[str] = []

        async def task_using_pool(p: str = Shared(create_pool)):
            results.append(p)

        docket.register(task_using_pool)
>       await docket.add(task_using_pool)()

tests/fundamentals/test_shared_dependencies.py:135:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_shared_depending_on_depends _______________________
docket = worker =

    async def test_shared_depending_on_depends(docket: Docket, worker: Worker):
        """A Shared can use Depends(), resolved once at worker scope."""
        call_count = 0

        def get_connection_string() -> str:
            nonlocal call_count
            call_count += 1
            return f"postgres://localhost/db{call_count}"

        @asynccontextmanager
        async def create_pool(
            url: str = Depends(get_connection_string),
        ) -> AsyncGenerator[str, None]:
            yield f"pool({url})"

        results: list[str] = []

        async def task_using_pool(p: str = Shared(create_pool)):
            results.append(p)

        docket.register(task_using_pool)
>       await docket.add(task_using_pool)()

tests/fundamentals/test_shared_dependencies.py:163:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________ test_shared_can_access_current_docket_and_worker _______________ docket = async def test_shared_can_access_current_docket_and_worker(docket: Docket): """Shared dependencies can use CurrentDocket and CurrentWorker.""" captured: dict[str, str] = {} @asynccontextmanager async def create_resource( d: Docket = CurrentDocket(), w: Worker = CurrentWorker(), ) -> AsyncGenerator[str, None]: captured["docket_name"] = d.name captured["worker_name"] = w.name yield "ready" async def task_using_shared(r: str = 
Shared(create_resource)): pass async with Worker( docket, name="test-worker", minimum_check_interval=timedelta(milliseconds=5) ) as worker: docket.register(task_using_shared) > await docket.add(task_using_shared)() tests/fundamentals/test_shared_dependencies.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_late_registered_task_with_new_shared ___________________ docket = async def test_late_registered_task_with_new_shared(docket: Docket): """A task registered after worker starts can introduce new Shared dependencies.""" init_order: list[str] = [] @asynccontextmanager async def create_early_resource() -> AsyncGenerator[str, None]: init_order.append("early") yield "early-resource" @asynccontextmanager async def create_late_resource() -> AsyncGenerator[str, None]: init_order.append("late") yield "late-resource" results: list[str] = [] async def early_task(r: str = Shared(create_early_resource)): results.append(r) async def late_task(r: str = Shared(create_late_resource)): results.append(r) async with Worker( docket, minimum_check_interval=timedelta(milliseconds=5) ) as worker: docket.register(early_task) > await docket.add(early_task)() tests/fundamentals/test_shared_dependencies.py:227: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await 
client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: 
ResponseError ______________________ test_multiple_shared_cleanup_order ______________________ docket = async def test_multiple_shared_cleanup_order(docket: Docket): """Multiple Shared dependencies clean up in reverse initialization order.""" order: list[str] = [] @asynccontextmanager async def create_first() -> AsyncGenerator[str, None]: order.append("first:start") yield "first" order.append("first:stop") @asynccontextmanager async def create_second( f: str = Shared(create_first), ) -> AsyncGenerator[str, None]: order.append("second:start") yield f"second(depends on {f})" order.append("second:stop") async def task_using_both( f: str = Shared(create_first), s: str = Shared(create_second), ): order.append("task-ran") async with Worker( docket, minimum_check_interval=timedelta(milliseconds=5) ) as worker: docket.register(task_using_both) > await docket.add(task_using_both)() tests/fundamentals/test_shared_dependencies.py:267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_shared_cleanup_on_init_failure ______________________ docket = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff96673620> async def test_shared_cleanup_on_init_failure( docket: Docket, caplog: pytest.LogCaptureFixture ): """If a Shared fails to initialize, earlier ones still clean up on worker exit. Since Shared dependencies are resolved lazily during task execution, init failures are handled as task failures (the task is marked failed but the worker continues). The key guarantee is that any Shared that DID initialize gets properly cleaned up when the worker exits. The error also appears in logs. 
""" cleanup_called = False @asynccontextmanager async def create_good() -> AsyncGenerator[str, None]: try: yield "good" finally: nonlocal cleanup_called cleanup_called = True @asynccontextmanager async def create_bad(g: str = Shared(create_good)) -> AsyncGenerator[str, None]: raise ValueError("🦆 QUACK! The rubber duck factory exploded! 🦆") yield # pragma: no cover async def task_using_bad(b: str = Shared(create_bad)): ... with caplog.at_level(logging.ERROR): async with Worker( docket, minimum_check_interval=timedelta(milliseconds=5) ) as worker: docket.register(task_using_bad) > await docket.add(task_using_bad)() tests/fundamentals/test_shared_dependencies.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string 
""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_shared_async_function_factory ______________________ docket = worker = async def test_shared_async_function_factory(docket: Docket, worker: Worker): """Shared can use an async function that returns a value (not a context manager).""" init_count = 0 async def load_config() -> dict[str, str]: nonlocal init_count init_count += 1 return {"api_url": "https://api.example.com", "version": f"v{init_count}"} results: list[dict[str, str]] = [] async def task_using_config(config: dict[str, str] = Shared(load_config)): results.append(config) docket.register(task_using_config) > await docket.add(task_using_config)() tests/fundamentals/test_shared_dependencies.py:338: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_shared_sync_function_factory _______________________ docket = worker = async def test_shared_sync_function_factory(docket: Docket, worker: Worker): """Shared can use a sync function that returns a value (not a context manager).""" init_count = 0 def create_config() -> dict[str, str]: nonlocal init_count init_count += 1 return {"db_host": "localhost", "init": str(init_count)} results: list[dict[str, str]] = [] async def task_using_config(config: dict[str, str] = Shared(create_config)): results.append(config) docket.register(task_using_config) > await docket.add(task_using_config)() tests/fundamentals/test_shared_dependencies.py:364: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_shared_sync_context_manager_factory ___________________ docket = async def test_shared_sync_context_manager_factory(docket: Docket): """Shared can use a sync context manager with cleanup.""" stages: list[str] = [] @contextmanager def create_resource() -> Generator[str, None, None]: stages.append("startup") yield "sync-resource" stages.append("shutdown") async def task_using_resource(r: str = Shared(create_resource)): stages.append(f"task-ran:{r}") async with Worker( docket, minimum_check_interval=timedelta(milliseconds=5) ) as worker: docket.register(task_using_resource) > await docket.add(task_using_resource)() tests/fundamentals/test_shared_dependencies.py:391: _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader 
else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________________ test_striking_entire_tasks __________________________ docket = worker = the_task = another_task = key_leak_checker = async def test_striking_entire_tasks( docket: Docket, worker: Worker, the_task: AsyncMock, another_task: AsyncMock, key_leak_checker: KeyCountChecker, ): """docket should support striking and restoring entire tasks""" > execution1 = await docket.add(the_task)("a", b="c") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/fundamentals/test_striking.py:18: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_striking_entire_parameters ________________________ docket = worker = the_task = another_task = key_leak_checker = async def test_striking_entire_parameters( docket: Docket, worker: Worker, the_task: AsyncMock, another_task: AsyncMock, key_leak_checker: KeyCountChecker, ): """docket should support striking and restoring entire parameters""" # Struck tasks remain without TTL so they can be restored key_leak_checker.add_pattern_exemption(f"{docket.prefix}:runs:*") > await docket.add(the_task)(customer_id="123", order_id="456") tests/fundamentals/test_striking.py:57: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_striking_tasks_for_specific_parameters __________________

docket = 
worker = 
the_task = 
another_task = 
key_leak_checker = 

    async def test_striking_tasks_for_specific_parameters(
        docket: Docket,
        worker: Worker,
        the_task: AsyncMock,
        another_task: AsyncMock,
        key_leak_checker: KeyCountChecker,
    ):
        """docket should support striking and restoring tasks for specific parameters"""
        # Struck tasks remain without TTL so they can be restored
        key_leak_checker.add_pattern_exemption(f"{docket.prefix}:runs:*")
>       await docket.add(the_task)("a", b=1)

tests/fundamentals/test_striking.py:161:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_sync_function_dependencies ________________________

docket = 
worker = 

    async def test_sync_function_dependencies(docket: Docket, worker: Worker):
        """A task can depend on the return value of sync functions"""

        def dependency_one() -> str:
            return f"one-{uuid4()}"

        def dependency_two() -> str:
            return f"two-{uuid4()}"

        called = 0

        async def dependent_task(
            one_a: str = Depends(dependency_one),
            one_b: str = Depends(dependency_one),
            two: str = Depends(dependency_two),
        ):
            assert one_a.startswith("one-")
            assert one_b == one_a
            assert two.startswith("two-")
            nonlocal called
            called += 1

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:34:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_sync_contextual_dependencies _______________________

docket = 
worker = 

    async def test_sync_contextual_dependencies(docket: Docket, worker: Worker):
        """A task can depend on the return value of sync context managers"""
        stages: list[str] = []

        @contextmanager
        def dependency_one() -> Generator[str, None, None]:
            stages.append("one-before")
            yield f"one-{uuid4()}"
            stages.append("one-after")

        def dependency_two() -> str:
            return f"two-{uuid4()}"

        called = 0

        async def dependent_task(
            one_a: str = Depends(dependency_one),
            one_b: str = Depends(dependency_one),
            two: str = Depends(dependency_two),
        ):
            assert one_a.startswith("one-")
            assert one_b == one_a
            assert two.startswith("two-")
            nonlocal called
            called += 1

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:70:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_mixed_sync_and_async_dependencies ____________________

docket = 
worker = 

    async def test_mixed_sync_and_async_dependencies(docket: Docket, worker: Worker):
        """A task can depend on both sync and async dependencies"""

        def sync_dependency() -> str:
            return f"sync-{uuid4()}"

        async def async_dependency() -> str:
            return f"async-{uuid4()}"

        called = 0

        async def dependent_task(
            sync_val: str = Depends(sync_dependency),
            async_val: str = Depends(async_dependency),
        ):
            assert sync_val.startswith("sync-")
            assert async_val.startswith("async-")
            nonlocal called
            called += 1

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:99:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_sync_dependencies_of_dependencies ____________________

docket = 
worker = 

    async def test_sync_dependencies_of_dependencies(docket: Docket, worker: Worker):
        """A sync task dependency can depend on other sync dependencies"""
        counter = 0

        def dependency_one() -> list[str]:
            nonlocal counter
            counter += 1
            return [f"one-{counter}"]

        def dependency_two(my_one: list[str] = Depends(dependency_one)) -> list[str]:
            nonlocal counter
            counter += 1
            return my_one + [f"two-{counter}"]

        def dependency_three(
            my_one: list[str] = Depends(dependency_one),
            my_two: list[str] = Depends(dependency_two),
        ) -> list[str]:
            nonlocal counter
            counter += 1
            return my_one + my_two + [f"three-{counter}"]

        async def dependent_task(
            one_a: list[str] = Depends(dependency_one),
            one_b: list[str] = Depends(dependency_one),
            two: list[str] = Depends(dependency_two),
            three: list[str] = Depends(dependency_three),
        ):
            assert one_a is one_b
            assert one_a == ["one-1"]
            assert two == ["one-1", "two-2"]
            assert three == ["one-1", "two-2", "three-3"]

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_sync_dependencies_can_ask_for_docket_dependencies ____________

docket = 
worker = 

    async def test_sync_dependencies_can_ask_for_docket_dependencies(
        docket: Docket, worker: Worker
    ):
        """A sync task dependency can ask for a docket dependency"""
        called = 0

        def dependency_one(this_docket: Docket = CurrentDocket()) -> str:
            assert this_docket is docket
            nonlocal called
            called += 1
            return f"one-{called}"

        def dependency_two(
            this_worker: Worker = CurrentWorker(),
            one: str = Depends(dependency_one),
        ) -> str:
            assert this_worker is worker
            assert one == "one-1"
            nonlocal called
            called += 1
            return f"two-{called}"

        async def dependent_task(
            one: str = Depends(dependency_one),
            two: str = Depends(dependency_two),
            this_docket: Docket = CurrentDocket(),
            this_worker: Worker = CurrentWorker(),
        ):
            assert one == "one-1"
            assert two == "two-2"
            assert this_docket is docket
            assert this_worker is worker
            nonlocal called
            called += 1

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:188:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_mixed_sync_async_nested_dependencies ___________________

docket = 
worker = 

    async def test_mixed_sync_async_nested_dependencies(docket: Docket, worker: Worker):
        """Dependencies can mix sync and async at different nesting levels"""
        counter = 0

        def sync_base() -> int:
            nonlocal counter
            counter += 1
            return counter

        async def async_multiplier(base: int = Depends(sync_base)) -> int:
            nonlocal counter
            counter += 1
            return base * 10

        def sync_adder(multiplied: int = Depends(async_multiplier)) -> int:
            nonlocal counter
            counter += 1
            return multiplied + 5

        async def dependent_task(result: int = Depends(sync_adder)):
            # 1 * 10 + 5 = 15
            assert result == 15

>       await docket.add(dependent_task)()

tests/fundamentals/test_sync_dependencies.py:218:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________________ test_simple_timeout ______________________________

docket = 
worker = 

    async def test_simple_timeout(docket: Docket, worker: Worker):
        """A task with a timeout completes normally when it finishes before the limit."""
        remaining_at_end: timedelta | None = None

        async def task_with_timeout(
            timeout: Timeout = Timeout(timedelta(milliseconds=500)),
        ):
            await asyncio.sleep(0.01)
            nonlocal remaining_at_end
            remaining_at_end = timeout.remaining()

>       await docket.add(task_with_timeout)()

tests/fundamentals/test_timeouts.py:22:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_simple_timeout_cancels_tasks _______________________

docket = 
worker = 

    async def test_simple_timeout_cancels_tasks(docket: Docket, worker: Worker):
        """A task can be scheduled with a timeout and are cancelled"""
        called = False

        async def task_with_timeout(
            timeout: Timeout = Timeout(timedelta(milliseconds=100)),
        ):
            try:
                await asyncio.sleep(5)
            except asyncio.CancelledError:
                nonlocal called
                called = True

>       await docket.add(task_with_timeout)()

tests/fundamentals/test_timeouts.py:45:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________________ test_timeout_can_be_extended _________________________

docket = 
worker = 

    async def test_timeout_can_be_extended(docket: Docket, worker: Worker):
        """A task can be scheduled with a timeout and extend themselves"""
        called = False

        async def task_with_timeout(
            timeout: Timeout = Timeout(timedelta(milliseconds=100)),
        ):
            await asyncio.sleep(0.05)
            timeout.extend(timedelta(milliseconds=200))
            try:
                await asyncio.sleep(5)
            except asyncio.CancelledError:
                nonlocal called
                called = True

>       await docket.add(task_with_timeout)()

tests/fundamentals/test_timeouts.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_timeout_extends_by_base_by_default ____________________

docket = 
worker = 

    async def test_timeout_extends_by_base_by_default(docket: Docket, worker: Worker):
        """A task can be scheduled with a timeout and extend itself by the base timeout"""
        called = False

        async def task_with_timeout(
            timeout: Timeout = Timeout(timedelta(milliseconds=100)),
        ):
            await asyncio.sleep(0.05)
            timeout.extend()  # defaults to the base timeout
            try:
                await asyncio.sleep(5)
            except asyncio.CancelledError:
                nonlocal called
                called = True

>       await docket.add(task_with_timeout)()

tests/fundamentals/test_timeouts.py:107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_timeout_is_compatible_with_retry _____________________

docket = 
worker = 

    async def test_timeout_is_compatible_with_retry(docket: Docket, worker: Worker):
        """A task that times out can be retried"""
        successes: list[int] = []

        async def task_with_timeout(
            retry: Retry = Retry(attempts=3),
            _timeout: Timeout = Timeout(timedelta(milliseconds=100)),
        ):
            if retry.attempt == 1:
                await asyncio.sleep(1)
            successes.append(retry.attempt)

>       await docket.add(task_with_timeout)()

tests/fundamentals/test_timeouts.py:134:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_adding_a_task_increments_counter _____________________

docket = 
the_task = 
task_labels = {'docket.name': 'test-docket-e09e1595-eb86-45a0-95b4-a3e7cca1d82f', 'docket.task': 'the_task'}
TASKS_ADDED = 
TASKS_REPLACED = 
TASKS_SCHEDULED = 

    async def test_adding_a_task_increments_counter(
        docket: Docket,
        the_task: AsyncMock,
        task_labels: dict[str, str],
        TASKS_ADDED: Mock,
        TASKS_REPLACED: Mock,
        TASKS_SCHEDULED: Mock,
    ):
        """Should increment the appropriate counters when adding a task."""
>       await docket.add(the_task)()

tests/instrumentation/test_counters.py:57:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_replacing_a_task_increments_counter ___________________ docket = the_task = task_labels = {'docket.name': 'test-docket-b27bea9d-a2b6-438d-9282-70a33ed12ea9', 'docket.task': 'the_task'} TASKS_ADDED = TASKS_REPLACED = TASKS_SCHEDULED = async def test_replacing_a_task_increments_counter( docket: Docket, the_task: AsyncMock, task_labels: dict[str, str], TASKS_ADDED: Mock, TASKS_REPLACED: Mock, TASKS_SCHEDULED: Mock, ): """Should increment the appropriate counters when replacing a task.""" from datetime import datetime, timezone when = datetime.now(timezone.utc) + timedelta(minutes=5) key = "test-replace-key" > await docket.replace(the_task, when, key)() tests/instrumentation/test_counters.py:78: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:469: in scheduler await execution.schedule(replace=True) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await 
client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: 
ResponseError __________________ test_cancelling_a_task_increments_counter ___________________ docket = the_task = TASKS_CANCELLED = async def test_cancelling_a_task_increments_counter( docket: Docket, the_task: AsyncMock, TASKS_CANCELLED: Mock, ): """Should increment the TASKS_CANCELLED counter when cancelling a task.""" from datetime import datetime, timezone when = datetime.now(timezone.utc) + timedelta(minutes=5) key = "test-cancel-key" > await docket.add(the_task, when=when, key=key)() tests/instrumentation/test_counters.py:103: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, 
can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_worker_execution_increments_task_counters ________________ docket = worker = the_task = worker_labels = {'docket.name': 'test-docket-e8d432ac-3c23-48f5-8bc4-3b835caea059', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'} TASKS_STARTED = TASKS_COMPLETED = TASKS_SUCCEEDED = TASKS_FAILED = TASKS_RETRIED = TASKS_REDELIVERED = async def test_worker_execution_increments_task_counters( docket: Docket, worker: Worker, the_task: AsyncMock, worker_labels: dict[str, str], TASKS_STARTED: Mock, TASKS_COMPLETED: Mock, TASKS_SUCCEEDED: Mock, TASKS_FAILED: Mock, TASKS_RETRIED: Mock, TASKS_REDELIVERED: Mock, ): """Should increment the appropriate task counters when a worker executes a task.""" > await docket.add(the_task)() tests/instrumentation/test_counters.py:191: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_failed_task_increments_failure_counter __________________ docket = worker = the_task = worker_labels = {'docket.name': 'test-docket-854eaabe-6f34-4bf9-b650-26b1517a8c97', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'} TASKS_STARTED = TASKS_COMPLETED = TASKS_SUCCEEDED = TASKS_FAILED = TASKS_RETRIED = TASKS_REDELIVERED = async def test_failed_task_increments_failure_counter( docket: Docket, worker: Worker, the_task: AsyncMock, worker_labels: dict[str, str], TASKS_STARTED: Mock, TASKS_COMPLETED: Mock, TASKS_SUCCEEDED: Mock, TASKS_FAILED: Mock, TASKS_RETRIED: Mock, TASKS_REDELIVERED: Mock, ): """Should increment the TASKS_FAILED counter when a task fails.""" the_task.side_effect = ValueError("Womp") > await docket.add(the_task)() tests/instrumentation/test_counters.py:218: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_retried_task_increments_retry_counter __________________ docket = worker = TASKS_STARTED = TASKS_COMPLETED = TASKS_SUCCEEDED = TASKS_FAILED = TASKS_RETRIED = TASKS_REDELIVERED = async def test_retried_task_increments_retry_counter( docket: Docket, worker: Worker, TASKS_STARTED: Mock, TASKS_COMPLETED: Mock, TASKS_SUCCEEDED: Mock, TASKS_FAILED: Mock, TASKS_RETRIED: Mock, TASKS_REDELIVERED: Mock, ): """Should increment the TASKS_RETRIED counter when a task is retried.""" async def the_task(retry: Retry 
= Retry(attempts=2)): # noqa: ARG001 raise ValueError("First attempt fails") > await docket.add(the_task)() tests/instrumentation/test_counters.py:245: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_exhausted_retried_task_increments_retry_counter _____________ docket = worker = worker_labels = {'docket.name': 'test-docket-07696aee-a252-492e-b181-957e3e47de9a', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'} TASKS_STARTED = TASKS_COMPLETED = TASKS_SUCCEEDED = TASKS_FAILED = TASKS_RETRIED = TASKS_REDELIVERED = async def test_exhausted_retried_task_increments_retry_counter( docket: Docket, worker: Worker, worker_labels: dict[str, str], TASKS_STARTED: Mock, TASKS_COMPLETED: Mock, TASKS_SUCCEEDED: Mock, TASKS_FAILED: Mock, TASKS_RETRIED: Mock, TASKS_REDELIVERED: Mock, ): """Should increment the appropriate counters when retries are exhausted.""" async def the_task(retry: Retry = Retry(attempts=1)): # noqa: ARG001 raise ValueError("First attempt fails") > await docket.add(the_task)() tests/instrumentation/test_counters.py:273: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_retried_task_metric_uses_bounded_labels _________________ 
docket = worker = TASKS_RETRIED = async def test_retried_task_metric_uses_bounded_labels( docket: Docket, worker: Worker, TASKS_RETRIED: Mock, ): """TASKS_RETRIED should only use bounded-cardinality labels (not task keys).""" async def the_task(retry: Retry = Retry(attempts=2)): # noqa: ARG001 raise ValueError("Always fails") > await docket.add(the_task)() tests/instrumentation/test_counters.py:295: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________ test_perpetuated_task_metric_uses_bounded_labels _______________ docket = worker = TASKS_PERPETUATED = async def test_perpetuated_task_metric_uses_bounded_labels( docket: Docket, worker: Worker, TASKS_PERPETUATED: Mock, ): """TASKS_PERPETUATED should only use bounded-cardinality labels (not task keys).""" async def the_task( perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)), # noqa: ARG001 ): pass > execution = await docket.add(the_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/instrumentation/test_counters.py:321: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await 
conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_redelivered_tasks_increment_redelivered_counter _____________ docket = TASKS_REDELIVERED = async def test_redelivered_tasks_increment_redelivered_counter( 
        docket: Docket,
        TASKS_REDELIVERED: Mock,
    ):
        """Should increment the TASKS_REDELIVERED counter for redelivered tasks."""

        async def test_task():
            await asyncio.sleep(0.01)

>       await docket.add(test_task)()

tests/instrumentation/test_counters.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_superseded_task_increments_superseded_counter ______________

docket =
worker =
TASKS_STARTED =
TASKS_COMPLETED =
TASKS_RUNNING =
TASKS_SUPERSEDED =

    async def test_superseded_task_increments_superseded_counter(
        docket: Docket,
        worker: Worker,
        TASKS_STARTED: Mock,
        TASKS_COMPLETED: Mock,
        TASKS_RUNNING: Mock,
        TASKS_SUPERSEDED: Mock,
    ):
        """Superseded tasks increment TASKS_SUPERSEDED but not lifecycle metrics.

        When claim() detects that a task has been superseded by a newer generation,
        the worker records TASKS_SUPERSEDED with docket.where=worker, but doesn't
        touch TASKS_STARTED, TASKS_RUNNING, or TASKS_COMPLETED.
        """

        async def superseded_task():
            pass  # pragma: no cover

>       await docket.add(superseded_task, key="metrics-superseded")()

tests/instrumentation/test_counters.py:400:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_replaced_task_only_counts_replacement __________________

docket =
worker =
TASKS_STARTED =
TASKS_COMPLETED =
TASKS_RUNNING =
TASKS_SUCCEEDED =
TASKS_SUPERSEDED =

    async def test_replaced_task_only_counts_replacement(
        docket: Docket,
        worker: Worker,
        TASKS_STARTED: Mock,
        TASKS_COMPLETED: Mock,
        TASKS_RUNNING: Mock,
        TASKS_SUCCEEDED: Mock,
        TASKS_SUPERSEDED: Mock,
    ):
        """When a task is replaced before execution, only the replacement runs.

        In the normal case, replace() successfully deletes the old stream message
        via XDEL, so the worker only sees the replacement. No supersession occurs
        because the stale message is already gone.
        """

        async def replaceable_task():
            pass

>       await docket.add(replaceable_task, key="metrics-replace")()

tests/instrumentation/test_counters.py:443:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_task_duration_is_measured ________________________

docket =
worker =
worker_labels = {'docket.name': 'test-docket-a36328c1-8cb9-4fb7-89c9-0c3db1d6b118', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'}
TASK_DURATION =

    async def test_task_duration_is_measured(
        docket: Docket, worker: Worker, worker_labels: dict[str, str], TASK_DURATION: Mock
    ):
        """Should record the duration of task execution in the TASK_DURATION histogram."""
        inner_elapsed = 0.0

        async def the_task():
            nonlocal inner_elapsed
            start = time.time()
            await asyncio.sleep(0.1)
            inner_elapsed = time.time() - start

>       await docket.add(the_task)()

tests/instrumentation/test_export.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_task_punctuality_is_measured _______________________

docket =
worker =
the_task =
worker_labels = {'docket.name': 'test-docket-aa023f10-5028-4667-9bc3-6b18d850ffff', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'}
TASK_PUNCTUALITY =
    @pytest.mark.skipif(
        sys.platform == "win32", reason="Timing-sensitive: unreliable on Windows"
    )
    async def test_task_punctuality_is_measured(
        docket: Docket,
        worker: Worker,
        the_task: AsyncMock,
        worker_labels: dict[str, str],
        TASK_PUNCTUALITY: Mock,
    ):
        """Should record TASK_PUNCTUALITY values for scheduled tasks."""
        when = datetime.now(timezone.utc) + timedelta(seconds=0.1)

>       await docket.add(the_task, when=when)()

tests/instrumentation/test_export.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_task_running_gauge_is_incremented ____________________

docket =
worker =
worker_labels = {'docket.name': 'test-docket-80f9868f-4b71-4b04-9138-1f5730affea3', 'docket.task': 'the_task', 'docket.worker': 'bde710d1552f45a3b346e4ebac757aed#574'}
TASKS_RUNNING =

    async def test_task_running_gauge_is_incremented(
        docket: Docket, worker: Worker, worker_labels: dict[str, str], TASKS_RUNNING: Mock
    ):
        """Should increment and decrement the TASKS_RUNNING gauge appropriately."""
        inside_task = False

        async def the_task():
            nonlocal inside_task
            inside_task = True
            TASKS_RUNNING.assert_called_once_with(1, worker_labels)

>       await docket.add(the_task)()

tests/instrumentation/test_export.py:112:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_worker_publishes_depth_gauges ______________________

docket =
docket_labels = {'docket.name': 'test-docket-48538f15-cfe4-4f2d-b3b0-f885d02e90b2'}
the_task =
QUEUE_DEPTH =
SCHEDULE_DEPTH =

    async def test_worker_publishes_depth_gauges(
        docket: Docket,
        docket_labels: dict[str, str],
        the_task: AsyncMock,
        QUEUE_DEPTH: Mock,
        SCHEDULE_DEPTH: Mock,
    ):
        """Should publish depth gauges for due and scheduled tasks."""
>       await docket.add(the_task)()

tests/instrumentation/test_export.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________________ test_agenda_scatter_basic ___________________________

docket =
agenda =
the_task =

    async def test_agenda_scatter_basic(
        docket: Docket, agenda: Agenda, the_task: AsyncMock
    ):
        """Should scatter tasks evenly over the specified timeframe."""
        docket.register(the_task)

        # Add 3 tasks to scatter over 60 seconds
        agenda.add(the_task)("task1")
        agenda.add(the_task)("task2")
        agenda.add(the_task)("task3")

        start_time = datetime.now(timezone.utc)
>       executions = await agenda.scatter(docket, over=timedelta(seconds=60))
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_agenda.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_agenda_scatter_with_start_time ______________________

docket =
agenda =
the_task =

    async def test_agenda_scatter_with_start_time(
        docket: Docket, agenda: Agenda, the_task: AsyncMock
    ):
        """Should scatter tasks starting from a future time."""
        docket.register(the_task)

        agenda.add(the_task)("task1")
        agenda.add(the_task)("task2")

        start_time = datetime.now(timezone.utc) + timedelta(minutes=10)
>       executions = await agenda.scatter(
            docket, start=start_time, over=timedelta(minutes=20)
        )

tests/test_agenda.py:87:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_agenda_scatter_with_jitter ________________________

docket =
agenda =
the_task =

    async def test_agenda_scatter_with_jitter(
        docket: Docket, agenda: Agenda, the_task: AsyncMock
    ):
        """Should add random jitter to scheduled times."""
        docket.register(the_task)

        # Add many tasks to verify jitter is applied
        for i in range(5):
            agenda.add(the_task)(f"task{i}")

        start_time = datetime.now(timezone.utc)
>       executions = await agenda.scatter(
            docket, over=timedelta(minutes=10), jitter=timedelta(seconds=30)
        )

tests/test_agenda.py:112:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_agenda_scatter_with_large_jitter _____________________

docket =
agenda =
the_task =

    async def test_agenda_scatter_with_large_jitter(
        docket: Docket, agenda: Agenda, the_task: AsyncMock
    ):
        """Should ensure jittered times never go before start even with large jitter."""
        docket.register(the_task)

        # Add tasks that will be scheduled close to start
        for i in range(3):
            agenda.add(the_task)(f"task{i}")

        start_time = datetime.now(timezone.utc)

        # Use a very large jitter (5 minutes) on a short window (1 minute)
        # This could potentially push times before start without our safety check
>       executions = await agenda.scatter(
            docket, start=start_time, over=timedelta(minutes=1), jitter=timedelta(minutes=5)
        )

tests/test_agenda.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_agenda_scatter_single_task ________________________

docket =
agenda =
the_task =

    async def test_agenda_scatter_single_task(
        docket: Docket, agenda: Agenda, the_task: AsyncMock
    ):
        """Should handle scattering a single task."""
        docket.register(the_task)

        agenda.add(the_task)("single")

        start_time = datetime.now(timezone.utc)
>       executions = await agenda.scatter(docket, over=timedelta(minutes=10))
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_agenda.py:163:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_agenda_scatter_heterogeneous_tasks ____________________

docket =
agenda =
the_task =
another_task =

    async def test_agenda_scatter_heterogeneous_tasks(
        docket: Docket, agenda: Agenda, the_task: AsyncMock, another_task: AsyncMock
    ):
        """Should scatter different types of tasks."""
        docket.register(the_task)
        docket.register(another_task)

        agenda.add(the_task)("task1", key="value1")
        agenda.add(another_task)(42, flag=True)
        agenda.add(the_task)("task2")
        agenda.add(another_task)(99)

>       executions = await agenda.scatter(docket, over=timedelta(seconds=90))
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_agenda.py:189:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter
    await scheduler(*execution.args, **execution.kwargs)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_agenda_scatter_preserves_order ______________________ 
docket = agenda = the_task = async def test_agenda_scatter_preserves_order( docket: Docket, agenda: Agenda, the_task: AsyncMock ): """Should preserve task order when scattering.""" docket.register(the_task) for i in range(10): agenda.add(the_task)(f"task{i}") > executions = await agenda.scatter(docket, over=timedelta(minutes=10)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_agenda.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter await scheduler(*execution.args, **execution.kwargs) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, 
can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_agenda_reusability ____________________________ docket = agenda = the_task = async def test_agenda_reusability(docket: Docket, agenda: Agenda, the_task: AsyncMock): """Agenda should be reusable for multiple scatter operations.""" docket.register(the_task) agenda.add(the_task)("task1") agenda.add(the_task)("task2") # First scatter > executions1 = await agenda.scatter(docket, over=timedelta(seconds=60)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_agenda.py:236: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter await scheduler(*execution.args, **execution.kwargs) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return 
await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_agenda_scatter_with_task_by_name _____________________ docket = agenda = the_task = async def test_agenda_scatter_with_task_by_name( docket: Docket, agenda: Agenda, the_task: AsyncMock ): """Should support adding tasks by name.""" docket.register(the_task) # Add task by its registered name agenda.add("the_task")("arg1", key="value") > executions = await agenda.scatter(docket, over=timedelta(seconds=60)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_agenda.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter await scheduler(*execution.args, **execution.kwargs) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________ test_agenda_scatter_partial_scheduling_behavior ________________ docket = agenda = the_task = another_task = async def test_agenda_scatter_partial_scheduling_behavior( docket: Docket, agenda: Agenda, the_task: AsyncMock, another_task: AsyncMock ): """Documents the partial scheduling behavior when failures occur.""" docket.register(the_task) # Don't register another_task initially # Test validation failure - unregistered task fails fast before any scheduling agenda.add(the_task)("task1") agenda.add(the_task)("task2") agenda.add("unregistered_task")("will_fail") # This will fail validation agenda.add(the_task)("task3") # The scatter should fail during validation before scheduling anything with pytest.raises(KeyError, match="Task 'unregistered_task' is not registered"): await agenda.scatter(docket, over=timedelta(seconds=60)) 
# Verify no tasks were scheduled (failed during validation) snapshot = await docket.snapshot() assert len(snapshot.future) == 0 # Test successful case with all registered tasks agenda2 = Agenda() docket.register(another_task) agenda2.add(the_task)("task1") agenda2.add(the_task)("task2") agenda2.add(another_task)("task3") agenda2.add(the_task)("task4") # All tasks should be scheduled successfully > executions = await agenda2.scatter(docket, over=timedelta(seconds=60)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_agenda.py:336: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter await scheduler(*execution.args, **execution.kwargs) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________ test_agenda_scatter_auto_registers_unregistered_functions ___________ docket = agenda = the_task = async def test_agenda_scatter_auto_registers_unregistered_functions( docket: Docket, agenda: Agenda, the_task: AsyncMock ): """Should automatically register task functions that aren't already registered.""" # the_task is NOT registered yet assert the_task not in docket.tasks.values() agenda.add(the_task)("task1") agenda.add(the_task)("task2") # scatter should auto-register the task > executions = await agenda.scatter(docket, over=timedelta(seconds=30)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_agenda.py:383: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/agenda.py:200: in scatter await scheduler(*execution.args, **execution.kwargs) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > 
raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_cancel_running_task ___________________________ docket = worker = async def test_cancel_running_task(docket: Docket, worker: Worker): """A running task can be cancelled via docket.cancel().""" started = asyncio.Event() cancelled = asyncio.Event() async def slow_task(): started.set() try: await asyncio.sleep(60) except asyncio.CancelledError: cancelled.set() raise docket.register(slow_task) > execution = await docket.add(slow_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_cancellation.py:28: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_cancel_running_task_state ________________________ docket = worker = async def test_cancel_running_task_state(docket: Docket, worker: Worker): """A cancelled running task transitions to CANCELLED state.""" started = asyncio.Event() async def slow_task(): started.set() await asyncio.sleep(60) docket.register(slow_task) > execution = await docket.add(slow_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_cancellation.py:53: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in 
schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_cancel_running_task_with_cleanup _____________________ docket = worker = async def test_cancel_running_task_with_cleanup(docket: Docket, worker: Worker): """A task can catch CancelledError to perform cleanup.""" started = asyncio.Event() cleanup_done = asyncio.Event() async def task_with_cleanup(): started.set() try: await asyncio.sleep(60) except asyncio.CancelledError: cleanup_done.set() raise docket.register(task_with_cleanup) > execution = await docket.add(task_with_cleanup)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_cancellation.py:84: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_cancel_task_that_ignores_cancellation __________________ docket = worker = async def test_cancel_task_that_ignores_cancellation(docket: Docket, worker: Worker): """A task that catches and swallows CancelledError continues to completion.""" started = asyncio.Event() cancellation_caught = asyncio.Event() completed = asyncio.Event() async def stubborn_task(): started.set() try: await asyncio.sleep(60) except asyncio.CancelledError: cancellation_caught.set() completed.set() docket.register(stubborn_task) > execution = await docket.add(stubborn_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_cancellation.py:115: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_cancel_already_completed_is_noop _____________________

docket = 
worker = 

    async def test_cancel_already_completed_is_noop(docket: Docket, worker: Worker):
        """Cancelling a task that has already completed is a no-op."""

        async def quick_task():
            pass

        docket.register(quick_task)

>       execution = await docket.add(quick_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:142: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_cancel_publishes_state_event _______________________

docket = 
worker = 

    async def test_cancel_publishes_state_event(docket: Docket, worker: Worker):
        """Cancelling a running task publishes a CANCELLED state event."""
        started = asyncio.Event()
        state_events: list[StateEvent | ProgressEvent] = []

        async def slow_task():
            started.set()
            await asyncio.sleep(60)

        docket.register(slow_task)

>       execution = await docket.add(slow_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_cancel_only_affects_running_worker ____________________

docket = 
worker = 
second_worker = 

    async def test_cancel_only_affects_running_worker(
        docket: Docket, worker: Worker, second_worker: Worker
    ):
        """A cancellation signal only affects the worker running the task."""
        started_on_worker = asyncio.Event()

        async def slow_task():
            started_on_worker.set()
            await asyncio.sleep(60)

        docket.register(slow_task)

>       execution = await docket.add(slow_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_cancel_running_task_with_zero_execution_ttl _______________

zero_ttl_docket = 

    async def test_cancel_running_task_with_zero_execution_ttl(zero_ttl_docket: Docket):
        """Cancellation with execution_ttl=0 deletes the execution record immediately."""
        started = asyncio.Event()
        cancelled = asyncio.Event()

        async def slow_task():
            started.set()
            try:
                await asyncio.sleep(60)
            except asyncio.CancelledError:
                cancelled.set()
                raise

        zero_ttl_docket.register(slow_task)

>       execution = await zero_ttl_docket.add(slow_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:249: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________ test_cancelled_task_with_retry_does_not_retry _________________

docket = 
worker = 

    async def test_cancelled_task_with_retry_does_not_retry(docket: Docket, worker: Worker):
        """A cancelled task should NOT retry, even if it has a Retry dependency."""
        from docket.dependencies import Retry

        started = asyncio.Event()
        execution_count = 0

        async def retryable_task(retry: Retry = Retry(attempts=3)):
            nonlocal execution_count
            execution_count += 1
            started.set()
            await asyncio.sleep(60)

        docket.register(retryable_task)

>       execution = await docket.add(retryable_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:292: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_cancelled_perpetual_task_does_not_perpetuate _______________

docket = 
worker = 

    async def test_cancelled_perpetual_task_does_not_perpetuate(
        docket: Docket, worker: Worker
    ):
        """A cancelled Perpetual task should NOT reschedule itself."""
        from docket.dependencies import Perpetual

        started = asyncio.Event()
        execution_count = 0

        async def perpetual_task(perpetual: Perpetual = Perpetual()):
            nonlocal execution_count
            execution_count += 1
            started.set()
            await asyncio.sleep(60)

        docket.register(perpetual_task)

>       execution = await docket.add(perpetual_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:326: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_cancel_running_task_with_timeout _____________________

docket = 
worker = 

    async def test_cancel_running_task_with_timeout(docket: Docket, worker: Worker):
        """A running task with Timeout can be cancelled via docket.cancel().

        Regression test for bug where _run_function_with_timeout converted ALL
        CancelledError to TimeoutError, breaking external cancellation.
        """
        from docket.dependencies import Timeout

        started = asyncio.Event()

        async def slow_task_with_timeout(timeout: Timeout = Timeout(timedelta(seconds=60))):
            started.set()
            await asyncio.sleep(60)

        docket.register(slow_task_with_timeout)

>       execution = await docket.add(slow_task_with_timeout)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:366: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________ test_get_result_raises_execution_cancelled_for_cancelled_task _________

docket = 
worker = 

    async def test_get_result_raises_execution_cancelled_for_cancelled_task(
        docket: Docket, worker: Worker
    ):
        """get_result() should raise ExecutionCancelled for cancelled tasks."""
        started = asyncio.Event()

        async def cancellable_task():
            started.set()
            await asyncio.sleep(60)

        docket.register(cancellable_task)

>       execution = await docket.add(cancellable_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_cancellation.py:389: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_sync_function_dependency _________________________

docket = 
worker = 

    async def test_sync_function_dependency(docket: Docket, worker: Worker):
        """A task can depend on a synchronous function"""
        called = False

        def sync_dependency() -> str:
            return "sync-value"

        async def dependent_task(value: str = Depends(sync_dependency)):
            assert value == "sync-value"
            nonlocal called
            called = True

>       await docket.add(dependent_task)()

tests/test_dependencies_advanced.py:27: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_sync_context_manager_dependency _____________________

docket = 
worker = 

    async def test_sync_context_manager_dependency(docket: Docket, worker: Worker):
        """A task can depend on a synchronous context manager"""
        called = False
        stages: list[str] = []

        @contextmanager
        def sync_cm_dependency():
            stages.append("before")
            yield "sync-cm-value"
            stages.append("after")

        async def dependent_task(value: str = Depends(sync_cm_dependency)):
            assert value == "sync-cm-value"
            stages.append("during")
            nonlocal called
            called = True

>       await docket.add(dependent_task)()

tests/test_dependencies_advanced.py:50: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_mixed_sync_and_async_dependencies ____________________

docket = 
worker = 

    async def test_mixed_sync_and_async_dependencies(docket: Docket, worker: Worker):
        """A task can depend on both sync and async dependencies"""
        called = False

        def sync_dependency() -> str:
            return "sync"

        async def async_dependency() -> str:
            return "async"

        @contextmanager
        def sync_cm():
            yield "sync-cm"

        async def dependent_task(
            sync_val: str = Depends(sync_dependency),
            async_val: str = Depends(async_dependency),
            sync_cm_val: str = Depends(sync_cm),
        ):
            assert sync_val == "sync"
            assert async_val == "async"
            assert sync_cm_val == "sync-cm"
            nonlocal called
            called = True

>       await docket.add(dependent_task)()

tests/test_dependencies_advanced.py:82: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_nested_sync_dependencies _________________________

docket = 
worker = 

    async def test_nested_sync_dependencies(docket: Docket, worker: Worker):
        """A sync dependency can depend on another sync dependency"""
        called = False

        def base_dependency() -> int:
            return 10

        def derived_dependency(base: int = Depends(base_dependency)) -> int:
            return base * 2

        async def dependent_task(value: int = Depends(derived_dependency)):
            assert value == 20
            nonlocal called
            called = True

>       await docket.add(dependent_task)()

tests/test_dependencies_advanced.py:103: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                    raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_sync_dependency_with_docket_context ___________________

docket = 
worker = 

    async def test_sync_dependency_with_docket_context(docket: Docket, worker: Worker):
        """A sync dependency can access docket context"""
        called = False

        def sync_dep_with_context(d: Docket = CurrentDocket()) -> str:
            assert d is docket
            return d.name

        async def dependent_task(name: str = Depends(sync_dep_with_context)):
            assert name == docket.name
            nonlocal called
            called = True

>       await docket.add(dependent_task)()

tests/test_dependencies_advanced.py:122: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_sync_context_manager_cleanup_on_exception ________________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff957baac0> async def test_sync_context_manager_cleanup_on_exception( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A sync context manager's cleanup runs even when the task fails""" stages: list[str] = [] @contextmanager def sync_cm_with_cleanup(): stages.append("enter") try: yield "value" finally: stages.append("exit") async def failing_task(value: str = Depends(sync_cm_with_cleanup)): stages.append("task") raise ValueError("Task failed") > await docket.add(failing_task)() tests/test_dependencies_advanced.py:146: _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and 
self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_sync_dependency_caching _________________________ docket = worker = async def test_sync_dependency_caching(docket: Docket, worker: Worker): """Sync dependencies are cached and only called once per task""" call_count = 0 def counted_dependency() -> str: nonlocal call_count call_count += 1 return f"call-{call_count}" async def dependent_task( val_a: str = Depends(counted_dependency), val_b: str = Depends(counted_dependency), ): # Both should be the same value since it's cached assert val_a == val_b assert val_a == "call-1" > await docket.add(dependent_task)() tests/test_dependencies_advanced.py:172: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_mixed_nested_dependencies ________________________ docket = worker = async def test_mixed_nested_dependencies(docket: Docket, worker: Worker): """Complex nesting with mixed sync and async dependencies""" called = False def sync_base() -> int: return 5 async def async_multiplier(base: int = Depends(sync_base)) -> int: return base * 3 def sync_adder(multiplied: int = Depends(async_multiplier)) -> int: return multiplied + 10 async def dependent_task(result: int = Depends(sync_adder)): # 5 * 3 + 10 
= 25 assert result == 25 nonlocal called called = True > await docket.add(dependent_task)() tests/test_dependencies_advanced.py:197: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: 
timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_contextvar_isolation_between_tasks ____________________ docket = worker = async def test_contextvar_isolation_between_tasks(docket: Docket, worker: Worker): """Contextvars should be isolated between sequential task executions""" executions_seen: list[tuple[str, Execution]] = [] async def first_task(a: str): # Capture the execution context during first task execution = Dependency.execution.get() executions_seen.append(("first", execution)) assert a == "first" async def second_task(b: str): # Capture the execution context during second task execution = Dependency.execution.get() executions_seen.append(("second", execution)) assert b == "second" > await docket.add(first_task)(a="first") tests/test_dependencies_advanced.py:219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry 
return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_contextvar_cleanup_after_task ______________________ docket = worker = async def test_contextvar_cleanup_after_task(docket: Docket, worker: Worker): """Task-scoped contextvars are reset after task execution completes. 
Worker-scoped contextvars (Dependency.docket, Dependency.worker) remain set for the entire worker lifetime to support Shared dependencies. """ captured_stack = None captured_cache = None async def capture_task(): nonlocal captured_stack, captured_cache # Capture references during task execution captured_stack = _Depends.stack.get() captured_cache = _Depends.cache.get() > await docket.add(capture_task)() tests/test_dependencies_advanced.py:253: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not 
self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_dependency_cache_isolated_between_tasks _________________ docket = worker = async def test_dependency_cache_isolated_between_tasks(docket: Docket, worker: Worker): """Dependency cache should be fresh for each task, not reused""" call_counts = {"task1": 0, "task2": 0} def dependency_for_task1() -> str: call_counts["task1"] += 1 return f"task1-call-{call_counts['task1']}" def dependency_for_task2() -> str: call_counts["task2"] += 1 return f"task2-call-{call_counts['task2']}" async def first_task(val: str = Depends(dependency_for_task1)): assert val == "task1-call-1" async def second_task(val: str = Depends(dependency_for_task2)): assert val == "task2-call-1" # Run tasks sequentially > await docket.add(first_task)() tests/test_dependencies_advanced.py:291: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_async_exit_stack_cleanup _________________________ docket = worker = async def test_async_exit_stack_cleanup(docket: Docket, worker: Worker): """AsyncExitStack should be properly cleaned up after task execution""" cleanup_called: list[str] = [] @asynccontextmanager async def tracked_resource(): try: yield "resource" finally: cleanup_called.append("cleaned") async def task_with_context(res: str = Depends(tracked_resource)): assert res == "resource" assert len(cleanup_called) == 0 # Not cleaned up yet > await docket.add(task_with_context)() tests/test_dependencies_advanced.py:317: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_dependencies_may_be_duplicated ______________________ docket = worker = async def test_dependencies_may_be_duplicated(docket: Docket, worker: Worker): called = False async def the_task( a: str, b: str, docketA: Docket = CurrentDocket(), docketB: Docket = CurrentDocket(), workerA: Worker = CurrentWorker(), workerB: Worker = CurrentWorker(), ): assert a == "a" assert b == "b" assert docketA is docket assert docketB is docket assert workerA is worker assert workerB is worker nonlocal called called = True > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:38: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_users_can_provide_dependencies_directly _________________ docket = worker = async def test_users_can_provide_dependencies_directly(docket: Docket, worker: Worker): called = False async def the_task( a: str, b: str, retry: Retry = Retry(attempts=3), ): assert a == "a" assert b == "b" assert retry.attempts == 42 nonlocal called called = True > await docket.add(the_task)("a", "b", retry=Retry(attempts=42)) tests/test_dependencies_core.py:60: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string 
""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_user_provide_retries_are_used ______________________ docket = worker = async def test_user_provide_retries_are_used(docket: Docket, worker: Worker): calls = 0 async def the_task( a: str, b: str, retry: Retry = Retry(attempts=42), ): assert a == "a" assert b == "b" assert retry.attempts == 2 nonlocal calls calls += 1 raise Exception("womp womp") > await docket.add(the_task)("a", "b", retry=Retry(attempts=2)) tests/test_dependencies_core.py:84: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ 
test_user_can_request_a_retry_after_a_delay[Retry] ______________ retry_cls = docket = worker = @pytest.mark.parametrize("retry_cls", [Retry, ExponentialRetry]) async def test_user_can_request_a_retry_after_a_delay( retry_cls: Retry, docket: Docket, worker: Worker ): calls = 0 first_call_time = None second_call_time = None async def the_task( a: str, b: str, retry: Retry = retry_cls(attempts=2), # type: ignore[reportCallIssue] ): assert a == "a" assert b == "b" nonlocal calls calls += 1 nonlocal first_call_time if not first_call_time: first_call_time = datetime.now(timezone.utc) retry.after(timedelta(seconds=0.5)) else: nonlocal second_call_time second_call_time = datetime.now(timezone.utc) > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:118: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = 
ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________ test_user_can_request_a_retry_after_a_delay[ExponentialRetry] _________ retry_cls = docket = worker = @pytest.mark.parametrize("retry_cls", [Retry, ExponentialRetry]) async def test_user_can_request_a_retry_after_a_delay( retry_cls: Retry, docket: Docket, worker: Worker ): calls = 0 first_call_time = None second_call_time = None async def the_task( a: str, b: str, retry: Retry = retry_cls(attempts=2), # type: ignore[reportCallIssue] ): assert a == "a" assert b == "b" nonlocal calls calls += 1 nonlocal first_call_time if not first_call_time: first_call_time = datetime.now(timezone.utc) retry.after(timedelta(seconds=0.5)) else: nonlocal second_call_time second_call_time = datetime.now(timezone.utc) > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:118: _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and 
self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_retry_in_is_backwards_compatible_alias_for_after _____________ docket = worker = async def test_retry_in_is_backwards_compatible_alias_for_after( docket: Docket, worker: Worker ): """retry.in_() still works as an alias for retry.after()""" calls = 0 async def the_task(retry: Retry = Retry(attempts=2)): nonlocal calls calls += 1 if calls == 1: retry.in_(timedelta(seconds=0.1)) > await docket.add(the_task)() tests/test_dependencies_core.py:143: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_user_can_request_a_retry_at_a_specific_time[Retry] ____________ retry_cls = docket = worker = @pytest.mark.parametrize("retry_cls", [Retry, ExponentialRetry]) async def test_user_can_request_a_retry_at_a_specific_time( retry_cls: Retry, docket: Docket, worker: Worker ): calls = 0 first_call_time = None second_call_time = None async def the_task( a: str, b: str, retry: Retry = retry_cls(attempts=2), # type: ignore[reportCallIssue] ): assert a == "a" assert b == "b" nonlocal calls calls += 1 nonlocal first_call_time if not first_call_time: when = datetime.now(timezone.utc) + timedelta(seconds=0.5) first_call_time = datetime.now(timezone.utc) retry.at(when) else: nonlocal second_call_time second_call_time = 
datetime.now(timezone.utc) > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:177: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = 
kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______ test_user_can_request_a_retry_at_a_specific_time[ExponentialRetry] ______ retry_cls = docket = worker = @pytest.mark.parametrize("retry_cls", [Retry, ExponentialRetry]) async def test_user_can_request_a_retry_at_a_specific_time( retry_cls: Retry, docket: Docket, worker: Worker ): calls = 0 first_call_time = None second_call_time = None async def the_task( a: str, b: str, retry: Retry = retry_cls(attempts=2), # type: ignore[reportCallIssue] ): assert a == "a" assert b == "b" nonlocal calls calls += 1 nonlocal first_call_time if not first_call_time: when = datetime.now(timezone.utc) + timedelta(seconds=0.5) first_call_time = datetime.now(timezone.utc) retry.at(when) else: nonlocal second_call_time second_call_time = datetime.now(timezone.utc) > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:177: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________ test_user_can_request_a_retry_at_a_specific_time_in_the_past _________ docket = worker = async def test_user_can_request_a_retry_at_a_specific_time_in_the_past( docket: Docket, worker: Worker ): 
calls = 0 first_call_time = None second_call_time = None async def the_task( a: str, b: str, retry: Retry = Retry(attempts=2), ): assert a == "a" assert b == "b" nonlocal calls calls += 1 nonlocal first_call_time if not first_call_time: when = datetime.now(timezone.utc) - timedelta(days=1) first_call_time = datetime.now(timezone.utc) retry.at(when) else: nonlocal second_call_time second_call_time = datetime.now(timezone.utc) > await docket.add(the_task)("a", "b") tests/test_dependencies_core.py:217: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def 
read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_dependencies_error_for_missing_task_argument _______________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff95fbdb70> async def test_dependencies_error_for_missing_task_argument( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A task will fail when asking for a missing task argument""" async def dependency_one(nope: list[str] = TaskArgument()) -> list[str]: raise NotImplementedError("This should not be called") # pragma: no cover async def dependent_task( a: list[str], b: list[str] = TaskArgument("a"), c: list[str] = Depends(dependency_one), ) -> None: raise NotImplementedError("This should not be called") # pragma: no cover > await docket.add(dependent_task)(a=["hello", "world"]) tests/test_dependencies_core.py:245: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: 
in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_a_task_argument_cannot_ask_for_itself __________________ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff9603a900> async def test_a_task_argument_cannot_ask_for_itself( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture ): """A task argument cannot ask for itself""" # This task would be nonsense, because it's asking for itself. async def dependent_task(a: list[str] = TaskArgument()) -> None: raise NotImplementedError("This should not be called") # pragma: no cover > await docket.add(dependent_task)() tests/test_dependencies_core.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script 
(call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_clear_with_immediate_tasks ________________________ docket = the_task = async def test_clear_with_immediate_tasks(docket: Docket, the_task: AsyncMock): """Should clear immediate tasks from the stream""" docket.register(the_task) > await docket.add(the_task)("arg1", kwarg1="value1") tests/test_docket_clear.py:25: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_clear_with_scheduled_tasks ________________________ 
docket = the_task = async def test_clear_with_scheduled_tasks(docket: Docket, the_task: AsyncMock): """Should clear scheduled future tasks from the queue""" docket.register(the_task) future = datetime.now(timezone.utc) + timedelta(seconds=60) > await docket.add(the_task, when=future)("arg1") tests/test_docket_clear.py:45: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_clear_with_mixed_tasks __________________________ docket = the_task = another_task = async def test_clear_with_mixed_tasks( docket: Docket, the_task: AsyncMock, another_task: AsyncMock ): """Should clear both immediate and scheduled tasks""" docket.register(the_task) docket.register(another_task) future = datetime.now(timezone.utc) + timedelta(seconds=60) > await docket.add(the_task)("immediate1") tests/test_docket_clear.py:68: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in 
_send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_clear_with_parked_tasks _________________________ docket = the_task = async def test_clear_with_parked_tasks(docket: Docket, the_task: AsyncMock): """Should clear parked tasks (tasks with specific keys)""" docket.register(the_task) > await docket.add(the_task, key="task1")("arg1") tests/test_docket_clear.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_clear_returns_total_count ________________________

docket =
the_task =

    async def test_clear_returns_total_count(docket: Docket, the_task: AsyncMock):
        """Should return the total number of tasks cleared"""
        docket.register(the_task)
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       await docket.add(the_task)("immediate1")

tests/test_docket_clear.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_clear_no_redis_key_leaks _________________________

docket =
the_task =

    async def test_clear_no_redis_key_leaks(docket: Docket, the_task: AsyncMock):
        """Should not leak Redis keys when clearing tasks"""
        docket.register(the_task)
>       await docket.add(the_task)("immediate1")

tests/test_docket_clear.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_clear_with_execution_ttl_zero ______________________
the_task =

    async def test_clear_with_execution_ttl_zero(the_task: AsyncMock):
        """Should delete runs hashes immediately when execution_ttl=0."""
        async with Docket(
            name="test-docket-ttl-zero", url="memory://", execution_ttl=timedelta(0)
        ) as docket:
            docket.register(the_task)
            # Add both immediate and scheduled tasks
>           await docket.add(the_task, key="immediate1")("arg1")

tests/test_docket_clear.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_docket_without_worker_does_not_create_group _______________

redis_url = 'memory://'
make_docket_name = ._make_name at 0x7fff95f67ab0>

    async def test_docket_without_worker_does_not_create_group(
        redis_url: str, make_docket_name: Callable[[], str]
    ):
        """A Docket used only for adding tasks should not create consumer group.

        Issue #206: Lazy stream/consumer group bootstrap.
        """
        docket = Docket(name=make_docket_name(), url=redis_url)

        async def dummy_task(): ...
        async with docket:
            docket.register(dummy_task)
            for _ in range(5):
>               await docket.add(dummy_task)()

tests/test_docket_clear.py:239:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_snapshot_handles_nogroup_with_real_redis[real] ______________

redis_url = 'memory://'
make_docket_name = ._make_name at 0x7fff958ea610>

    @pytest.mark.parametrize("redis_url", ["real"], indirect=True)
    async def test_snapshot_handles_nogroup_with_real_redis(
        redis_url: str, make_docket_name: Callable[[], str]
    ):
        """Snapshot should handle NOGROUP error and create group automatically.

        Issue #206: Lazy stream/consumer group bootstrap.

        This test uses real Redis (not memory://) to verify the NOGROUP error
        handling path in snapshot(), since the memory:// backend proactively
        creates the group to work around a fakeredis bug.
        """
        docket = Docket(name=make_docket_name(), url=redis_url)

        async def dummy_task(): ...
        async with docket:
            docket.register(dummy_task)
            # Add a task to create the stream (but not the consumer group)
>           await docket.add(dummy_task)()

tests/test_docket_clear.py:267:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_docket_schedule_method_with_immediate_task ________________

docket =
the_task =

    async def test_docket_schedule_method_with_immediate_task(
        docket: Docket, the_task: AsyncMock
    ):
        """Test direct scheduling via docket.schedule(execution) for immediate execution."""
        # Register task so snapshot can look it up
        docket.register(the_task)
        execution = Execution(
            docket, the_task, ("arg",), {}, "test-key", datetime.now(timezone.utc), 1
        )
>       await docket.schedule(execution)

tests/test_docket_execution.py:24:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:506: in schedule
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_get_execution_for_scheduled_task _____________________

docket =
the_task =

    async def test_get_execution_for_scheduled_task(docket: Docket, the_task: AsyncMock):
        """get_execution should return execution for scheduled task with correct data."""
        docket.register(the_task)
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       await docket.add(the_task, when=future, key="scheduled-task")(
            "arg1", kwarg1="value1"
        )
tests/test_docket_execution.py:62:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_get_execution_for_queued_task ______________________

docket =
the_task =

    async def test_get_execution_for_queued_task(docket: Docket, the_task: AsyncMock):
        """get_execution should return execution for immediate (queued) task."""
        docket.register(the_task)
>       await docket.add(the_task, key="immediate-task")("arg1", kwarg1="value1")

tests/test_docket_execution.py:78:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_get_execution_function_not_registered __________________

docket =
the_task =

    async def test_get_execution_function_not_registered(
        docket: Docket, the_task: AsyncMock
    ):
        """get_execution should create placeholder when function not registered in current docket."""
        # Schedule a task with the function registered
        docket.register(the_task)
>       await docket.add(the_task, key="task-key")("arg1")

tests/test_docket_execution.py:94:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_get_execution_with_complex_args _____________________

docket =
the_task =

    async def test_get_execution_with_complex_args(docket: Docket, the_task: AsyncMock):
        """get_execution should handle complex args and kwargs."""
        docket.register(the_task)
        complex_arg = {"nested": {"data": [1, 2, 3]}, "key": "value"}
        complex_kwarg = {"items": [{"id": 1}, {"id": 2}]}
>       await docket.add(the_task, key="complex-task")(complex_arg, data=complex_kwarg)

tests/test_docket_execution.py:114:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_get_execution_claim_check_pattern ____________________

docket =
the_task =

    async def test_get_execution_claim_check_pattern(docket: Docket, the_task: AsyncMock):
        """Demonstrate the claim check pattern: schedule task, get key, retrieve later."""
        docket.register(the_task)
        # Schedule a task and get the key
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       original_execution = await docket.add(
            the_task, when=future, key="claim-check-task"
        )("important-data", priority="high")

tests/test_docket_execution.py:128:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_cancelled_state_creates_tombstone ____________________

docket =
the_task =

    async def test_cancelled_state_creates_tombstone(docket: Docket, the_task: AsyncMock):
        """Cancelling a task should create a tombstone with CANCELLED state."""
        docket.register(the_task)
        # Schedule a future task
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       execution = await docket.add(the_task, when=future, key="task-to-cancel")(
            "arg1", kwarg1="value1"
        )

tests/test_docket_execution.py:228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
    ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs:
Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_cancelled_state_respects_ttl _______________________ docket = the_task = async def test_cancelled_state_respects_ttl(docket: Docket, the_task: AsyncMock): """Cancelled task tombstone should have TTL set from execution_ttl.""" docket.register(the_task) # Schedule a task future = datetime.now(timezone.utc) + timedelta(seconds=60) > execution = await docket.add(the_task, when=future, key="ttl-task")("test") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_docket_execution.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in 
execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_cancelled_state_with_ttl_zero ______________________ docket = the_task = make_docket_name = ._make_name at 
0x7fff95f619b0> async def test_cancelled_state_with_ttl_zero( docket: Docket, the_task: AsyncMock, make_docket_name: Callable[[], str] ): """Cancelled task with execution_ttl=0 should delete tombstone immediately.""" # Create docket with TTL=0 async with Docket( name=make_docket_name(), url=docket.url, execution_ttl=timedelta(0), ) as zero_ttl_docket: zero_ttl_docket.register(the_task) # Schedule and cancel a task future = datetime.now(timezone.utc) + timedelta(seconds=60) > execution = await zero_ttl_docket.add( the_task, when=future, key="zero-ttl-task" )("test") tests/test_docket_execution.py:281: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack 
traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_get_execution_after_cancel ________________________ docket = the_task = async def test_get_execution_after_cancel(docket: Docket, the_task: AsyncMock): """get_execution should retrieve cancelled task state.""" docket.register(the_task) # Schedule task > execution = await docket.add(the_task, key="cancelled-task")("data") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_docket_execution.py:296: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_replace_does_not_set_cancelled_state ___________________
docket = the_task = 

    async def test_replace_does_not_set_cancelled_state(
        docket: Docket, the_task: AsyncMock
    ):
        """replace() should not create CANCELLED state - it's a replacement, not cancellation."""
        docket.register(the_task)

        # Schedule a task
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       await docket.add(the_task, when=future, key="replace-task")("original")

tests/test_docket_execution.py:316:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_cancellation_idempotent_with_tombstone __________________

docket = the_task = 

    async def test_cancellation_idempotent_with_tombstone(
        docket: Docket, the_task: AsyncMock
    ):
        """Cancelling twice should be idempotent - second cancel sees the tombstone."""
        docket.register(the_task)

        # Schedule a task
        future = datetime.now(timezone.utc) + timedelta(seconds=60)
>       execution = await docket.add(the_task, when=future, key="idempotent-task")("test")
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_docket_execution.py:336:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_registered_task_usable_after_aenter ___________________

    async def test_registered_task_usable_after_aenter():
        """Tasks registered before __aenter__ should be usable inside the context."""
        docket = Docket(name="test-pre-register-usable", url="memory://")

        async def my_task(_value: str) -> None: ...

        docket.register(my_task)

        async with docket:
            assert "my_task" in docket.tasks
>           execution = await docket.add(my_task)("test-value")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_docket_registration.py:39:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________________ test_schedule_task_by_alias __________________________

docket = worker = 

    async def test_schedule_task_by_alias(docket: Docket, worker: Worker):
        """Tasks can be scheduled by their alias name."""
        results: list[str] = []

        async def my_task(value: str) -> None:
            results.append(value)

        docket.register(my_task, names=["task_alias"])

>       await docket.add("task_alias")("hello")

tests/test_docket_registration.py:159:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________________ test_run_state_scheduled ___________________________

docket = the_task = 

    async def test_run_state_scheduled(docket: Docket, the_task: AsyncMock):
        """Execution should be set to QUEUED when an immediate task is added."""
>       execution = await docket.add(the_task)("arg1", "arg2")
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_execution_state.py:13:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_run_state_pending_to_running _______________________

docket = worker = 

    async def test_run_state_pending_to_running(docket: Docket, worker: Worker):
        """Execution should transition from QUEUED to RUNNING during execution."""
        executed = asyncio.Event()

        async def test_task():
            # Verify we're in RUNNING state
            executed.set()

>       await docket.add(test_task)()

tests/test_execution_state.py:28:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_run_state_completed_on_success ______________________

docket = worker = the_task = 

    async def test_run_state_completed_on_success(
        docket: Docket, worker: Worker, the_task: AsyncMock
    ):
        """Execution should be set to COMPLETED when task succeeds."""
>       execution = await docket.add(the_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_execution_state.py:44:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_run_state_failed_on_exception ______________________

docket = worker = 

    async def test_run_state_failed_on_exception(docket: Docket, worker: Worker):
        """Execution should be set to FAILED when task raises an exception."""

        async def failing_task():
            raise ValueError("Task failed!")

>       execution = await docket.add(failing_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_execution_state.py:58:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_run_state_ttl_after_completion ______________________

docket = worker = the_task = 

    async def test_run_state_ttl_after_completion(
        docket: Docket, worker: Worker, the_task: AsyncMock
    ):
        """Run state should have TTL set after completion."""
>       execution = await docket.add(the_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_execution_state.py:70:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return
await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________________ test_custom_execution_ttl ___________________________ redis_url = 'memory://', the_task = make_docket_name = ._make_name at 0x7fff947a8d50> async def test_custom_execution_ttl( redis_url: str, the_task: AsyncMock, make_docket_name: Callable[[], str] ): """Docket 
should respect custom execution_ttl configuration.""" # Create docket with custom 5-minute TTL custom_ttl = timedelta(minutes=5) async with Docket( name=make_docket_name(), url=redis_url, execution_ttl=custom_ttl ) as docket: async with Worker(docket) as worker: > execution = await docket.add(the_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_execution_state.py:95: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not 
self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_full_lifecycle_integration ________________________ docket = worker = async def test_full_lifecycle_integration(docket: Docket, worker: Worker): """Test complete lifecycle: SCHEDULED -> QUEUED -> RUNNING -> COMPLETED.""" states_observed: list[ExecutionState] = [] async def tracking_task(progress: Progress = Progress()): await progress.set_total(3) for i in range(3): await progress.increment() await progress.set_message(f"Step {i + 1}") await asyncio.sleep(0.01) # Schedule task in the future when = datetime.now(timezone.utc) + timedelta(milliseconds=50) > execution = await docket.add(tracking_task, when=when)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_execution_state.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_run_add_returns_run_instance _______________________ 
docket = the_task = async def test_run_add_returns_run_instance(docket: Docket, the_task: AsyncMock): """Verify that docket.add() returns an Execution instance.""" > result = await docket.add(the_task)("arg1") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_execution_state.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except 
asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_error_message_stored_on_failure _____________________ docket = worker = async def test_error_message_stored_on_failure(docket: Docket, worker: Worker): """Failed run should store error message.""" async def failing_task(): raise RuntimeError("Something went wrong!") > execution = await docket.add(failing_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_execution_state.py:161: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_mark_as_failed_without_error_message ___________________ self = keys = ['test-docket-456d512a-c61d-4e8e-80ce-0ca7cf9d0711:runs:test-key'] args = ('test-docket-456d512a-c61d-4e8e-80ce-0ca7cf9d0711:runs:test-key', '0', 'failed', '2026-03-12T17:12:23.287463+00:00', '900') client = ,db=0)>)>)> async def __call__( self, keys: Union[Sequence[KeyT], None] = None, args: Union[Iterable[EncodableT], None] = None, client: 
Union["redis.asyncio.client.Redis", None] = None, ): """Execute the script, passing any required ``args``""" keys = keys or [] args = args or [] if client is None: client = self.registered_client args = tuple(keys) + tuple(args) # make sure the Redis server knows about the script from redis.asyncio.client import Pipeline if isinstance(client, Pipeline): # Make sure the pipeline can register the script before executing. client.scripts.add(self) try: > return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {}, response = NoScriptError('No matching script. 
Please use EVAL.') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.NoScriptError: No matching script. Please use EVAL. /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError During handling of the above exception, another exception occurred: docket = async def test_mark_as_failed_without_error_message(docket: Docket): """Test mark_as_failed with error=None.""" execution = Execution( docket, AsyncMock(), (), {}, "test-key", datetime.now(timezone.utc), 1 ) await execution.claim("worker-1") > await execution.mark_as_failed(error=None) tests/test_execution_state.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:748: in mark_as_failed await self._mark_as_terminal( ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:706: in _mark_as_terminal await terminal_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:22: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:22: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_default_fallback_task_logs_and_acks ___________________ docket = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff94996dd0> the_task = key_leak_checker = async def test_default_fallback_task_logs_and_acks( docket: Docket, caplog: pytest.LogCaptureFixture, the_task: AsyncMock, 
key_leak_checker: KeyCountChecker, ): """Default fallback should log a warning and acknowledge the message.""" > await docket.add(the_task)() tests/test_fallback_task.py:23: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_custom_fallback_receives_original_args_kwargs ______________ docket = key_leak_checker = async def test_custom_fallback_receives_original_args_kwargs( docket: Docket, key_leak_checker: KeyCountChecker, ): """Custom fallback should receive the original task's args and kwargs.""" received_args: tuple[Any, ...] = () received_kwargs: dict[str, Any] = {} async def custom_fallback(*args: Any, **kwargs: Any) -> None: nonlocal received_args, received_kwargs received_args = args received_kwargs = kwargs async def original_task(x: int, y: str, z: bool = True) -> None: pass # pragma: no cover docket.register(original_task) > await docket.add(original_task)(42, "hello", z=False) tests/test_fallback_task.py:66: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_fallback_can_access_function_name ____________________ docket = key_leak_checker = async def test_fallback_can_access_function_name( docket: Docket, key_leak_checker: 
KeyCountChecker, ): """Fallback should be able to access the original function name via execution.function_name.""" captured_function_name: str | None = None async def custom_fallback( *args: Any, execution: Execution = CurrentExecution(), **kwargs: Any, ) -> None: nonlocal captured_function_name captured_function_name = execution.function_name async def my_special_task() -> None: pass # pragma: no cover docket.register(my_special_task) > await docket.add(my_special_task)() tests/test_fallback_task.py:102: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def 
read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_fallback_dependency_injection ______________________ docket = key_leak_checker = async def test_fallback_dependency_injection( docket: Docket, key_leak_checker: KeyCountChecker, ): """Fallback should support full dependency injection like regular tasks.""" captured_execution: Execution | None = None captured_logger: logging.LoggerAdapter[logging.Logger] | None = None async def custom_fallback( *args: Any, execution: Execution = CurrentExecution(), logger: logging.LoggerAdapter[logging.Logger] = TaskLogger(), **kwargs: Any, ) -> None: nonlocal captured_execution, captured_logger captured_execution = execution captured_logger = logger async def some_task(value: int) -> None: pass # pragma: no cover docket.register(some_task) > await docket.add(some_task)(123) tests/test_fallback_task.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_fallback_custom_user_dependency _____________________

docket = 
key_leak_checker = 

    async def test_fallback_custom_user_dependency(
        docket: Docket,
        key_leak_checker: KeyCountChecker,
    ):
        """Fallback should support custom user dependencies via Depends()."""
        from docket.dependencies import Depends

        async def get_request_id() -> str:
            return "req-12345"

        captured_request_id: str | None = None

        async def custom_fallback(
            *args: Any,
            request_id: str = Depends(get_request_id),
            **kwargs: Any,
        ) -> None:
            nonlocal captured_request_id
            captured_request_id = request_id

        async def some_task() -> None:
            pass  # pragma: no cover

        docket.register(some_task)

>       await docket.add(some_task)()

tests/test_fallback_task.py:182:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_fallback_return_completes_task ______________________

docket = 
key_leak_checker = 

    async def test_fallback_return_completes_task(
        docket: Docket,
        key_leak_checker: KeyCountChecker,
    ):
        """A fallback that returns normally should complete and ACK the task."""

        async def custom_fallback(*args: Any, **kwargs: Any) -> str:
            return "handled"

        async def missing_task() -> None:
            pass  # pragma: no cover

        docket.register(missing_task)

>       await docket.add(missing_task)()

tests/test_fallback_task.py:209:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_fallback_exception_triggers_retry ____________________

docket = 
key_leak_checker = 

    async def test_fallback_exception_triggers_retry(
        docket: Docket,
        key_leak_checker: KeyCountChecker,
    ):
        """A fallback that raises should trigger retry behavior when using Retry dependency."""
        from docket import Retry

        call_count = 0

        async def failing_fallback(
            *args: Any,
            retry: Retry = Retry(attempts=5, delay=timedelta(milliseconds=10)),
            **kwargs: Any,
        ) -> None:
            nonlocal call_count
            call_count += 1
            if call_count < 3:
                raise ValueError("Simulated failure")

        async def some_task() -> None:
            pass  # pragma: no cover

        docket.register(some_task)

>       await docket.add(some_task)()

tests/test_fallback_task.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_execution_function_name_matches_for_known_tasks _____________

docket = 
worker = 
key_leak_checker = 

    async def test_execution_function_name_matches_for_known_tasks(
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ):
        """For known tasks, execution.function_name should match function.__name__."""
        captured_function_name: str | None = None
        captured_function_dunder_name: str | None = None

        async def known_task(execution: Execution = CurrentExecution()) -> None:
            nonlocal captured_function_name, captured_function_dunder_name
            captured_function_name = execution.function_name
            captured_function_dunder_name = execution.function.__name__

>       await docket.add(known_task)()

tests/test_fallback_task.py:281:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_retrying_task_is_not_marked_as_failed __________________

docket = 
worker = 

    async def test_retrying_task_is_not_marked_as_failed(docket: Docket, worker: Worker):
        """When FailureHandler schedules a retry, the task state should not be FAILED."""
        attempts = 0

        async def the_task(retry: Retry = Retry(attempts=3)):
            nonlocal attempts
            attempts += 1
            raise ValueError("fail")

>       execution = await docket.add(the_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_handler_semantics.py:18:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_exhausted_retries_marks_task_as_failed __________________

docket = 
worker = 

    async def test_exhausted_retries_marks_task_as_failed(docket: Docket, worker: Worker):
        """When all retries are exhausted, the task state should be FAILED."""
        attempts = 0

        async def the_task(retry: Retry = Retry(attempts=2)):
            nonlocal attempts
            attempts += 1
            raise ValueError("fail")

>       execution = await docket.add(the_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
tests/test_handler_semantics.py:39:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_failed_perpetual_task_is_rescheduled ___________________

docket = 
worker = 

    async def test_failed_perpetual_task_is_rescheduled(docket: Docket, worker: Worker):
        """A Perpetual task that fails should still be rescheduled for next execution."""
        attempts = 0

        async def the_task(
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)),
        ):
            nonlocal attempts
            attempts += 1
            raise ValueError("fail")

>       execution = await docket.add(the_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_handler_semantics.py:59:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_retry_and_perpetual_work_together ____________________

docket = 
worker = 

    async def test_retry_and_perpetual_work_together(docket: Docket, worker: Worker):
        """A task can have both Retry and Perpetual - Retry handles failures first."""
        # Track: (perpetual_run, retry_attempt, succeeded)
        runs: list[tuple[int, int, bool]] = []
        perpetual_run = 0

        async def task_with_both(
            retry: Retry = Retry(attempts=2),
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)),
        ):
            nonlocal perpetual_run
            # First perpetual run: fail twice (exhaust retries), then perpetual reschedules
            # Second perpetual run: succeed on first attempt
            if perpetual_run == 0:
                perpetual_run = 1
            elif retry.attempt == 1 and len([r for r in runs if r[0] == 2]) == 0:
                perpetual_run = 2
            should_fail = perpetual_run == 1
            runs.append((perpetual_run, retry.attempt, not should_fail))
            if should_fail:
                raise ValueError("failing first perpetual run")

>       execution = await docket.add(task_with_both)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_handler_semantics.py:98:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_perpetual_after_is_respected_on_failure _________________

docket = 
worker = 

    async def test_perpetual_after_is_respected_on_failure(docket: Docket, worker: Worker):
        """Perpetual.after() delay is used even when the task fails."""
        run_times: list[datetime] = []

        async def failing_task(perpetual: Perpetual = Perpetual()):
            run_times.append(datetime.now(timezone.utc))
            perpetual.after(timedelta(milliseconds=100))
            raise ValueError("intentional failure")

>       execution = await docket.add(failing_task)()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_handler_semantics.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_leak_detection_catches_keys_without_ttl _________________
redis_url = 'memory://'
docket = 
worker = 
key_leak_checker = 

    async def test_leak_detection_catches_keys_without_ttl(
        redis_url: str,
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that the leak checker catches keys created without TTL."""
        leaked_key = docket.key("leaked-key")

        async def task_that_leaks() -> None:
            """Task that intentionally creates a key without TTL."""
            async with docket.redis() as redis:
                # Intentionally create a key without TTL
                await redis.set(leaked_key, "oops")

        docket.register(task_that_leaks)

        # Exempt the intentional leak from autouse checker
        key_leak_checker.add_exemption(leaked_key)

>       await docket.add(task_that_leaks)()

tests/test_key_leak_protection.py:37:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________________ test_permanent_keys_are_exempt ________________________

docket = 
worker = 
key_leak_checker = 

    async def test_permanent_keys_are_exempt(
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that permanent infrastructure keys are not flagged as leaks."""

        async def simple_task() -> None:
            pass

        docket.register(simple_task)

>       await docket.add(simple_task)()

tests/test_key_leak_protection.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________________ test_exemption_mechanism ___________________________

redis_url = 'memory://'
docket = 
worker = 
key_leak_checker = 

    async def test_exemption_mechanism(
        redis_url: str,
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that test-specific exemptions work."""

        async def task_with_special_key() -> None:
            """Task that creates a key we want to exempt."""
            async with docket.redis() as redis:
                await redis.set(f"{docket.name}:special-key", "intentional")

        docket.register(task_with_special_key)

        # Exempt this specific key
        key_leak_checker.add_exemption(f"{docket.name}:special-key")

>       await docket.add(task_with_special_key)()

tests/test_key_leak_protection.py:92:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

___________________________ test_multiple_exemptions ___________________________

redis_url = 'memory://'
docket = 
worker = 
key_leak_checker = 

    async def test_multiple_exemptions(
        redis_url: str,
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that multiple exemptions can be added."""

        async def task_with_multiple_keys() -> None:
            """Task that creates multiple keys we want to exempt."""
            async with docket.redis() as redis:
                await redis.set(f"{docket.name}:special-key-1", "intentional")
                await redis.set(f"{docket.name}:special-key-2", "intentional")

        docket.register(task_with_multiple_keys)

        # Exempt both keys
        key_leak_checker.add_exemption(f"{docket.name}:special-key-1")
        key_leak_checker.add_exemption(f"{docket.name}:special-key-2")

>       await docket.add(task_with_multiple_keys)()

tests/test_key_leak_protection.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

_______________________ test_worker_task_sets_are_exempt _______________________

docket = 
worker = 
key_leak_checker = 

    async def test_worker_task_sets_are_exempt(
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that worker-tasks and task-workers sets are properly handled.

        These sets don't have explicit TTLs but are self-cleaning via worker
        heartbeat expiration, so they should be exempt from leak detection.
        """

        async def simple_task() -> None:
            pass

        docket.register(simple_task)

>       await docket.add(simple_task)()

tests/test_key_leak_protection.py:143:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

___________________________ test_queue_is_cleaned_up ___________________________

docket = 
worker = 
key_leak_checker = 

    async def test_queue_is_cleaned_up(
        docket: Docket,
        worker: Worker,
        key_leak_checker: KeyCountChecker,
    ) -> None:
        """Verify that the queue sorted set is cleaned up after tasks complete."""

        async def scheduled_task() -> None:
            pass

        docket.register(scheduled_task)

        # Schedule a task for the future
>       await docket.add(
            scheduled_task, when=datetime.now(timezone.utc) + timedelta(seconds=1)
        )()
tests/test_key_leak_protection.py:165:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

__________________________ test_docket_memory_backend __________________________

self = 
keys = ['test-memory-docket:stream', 'test-memory-docket:known:019ce309-320b-722c-be2e-7733a77dc79e', 'test-memory-docket:019...docket:stream-id:019ce309-320b-722c-be2e-7733a77dc79e', 'test-memory-docket:runs:019ce309-320b-722c-be2e-7733a77dc79e']
args = ('test-memory-docket:stream', 'test-memory-docket:known:019ce309-320b-722c-be2e-7733a77dc79e', 'test-memory-docket:019...t:stream-id:019ce309-320b-722c-be2e-7733a77dc79e', 'test-memory-docket:runs:019ce309-320b-722c-be2e-7733a77dc79e', ...)
client = ,db=0)>)>)>

    async def __call__(
        self,
        keys: Union[Sequence[KeyT], None] = None,
        args: Union[Iterable[EncodableT], None] = None,
        client: Union["redis.asyncio.client.Redis", None] = None,
    ):
        """Execute the script, passing any required ``args``"""
        keys = keys or []
        args = args or []
        if client is None:
            client = self.registered_client
        args = tuple(keys) + tuple(args)
        # make sure the Redis server knows about the script
        from redis.asyncio.client import Pipeline

        if isinstance(client, Pipeline):
            # Make sure the pipeline can register the script before executing.
            client.scripts.add(self)
        try:
>           return await client.evalsha(self.sha, len(keys), *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

/usr/lib/python3.14/site-packages/redis/commands/core.py:5572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.NoScriptError: No matching script. Please use EVAL.

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError

During handling of the above exception, another exception occurred:

    async def test_docket_memory_backend():
        """Test using in-memory backend via memory:// URL."""
        async with Docket(name="test-memory-docket", url="memory://") as docket:
            result_value = None

            async def simple_task(value: str) -> str:
                nonlocal result_value
                result_value = value
                return value

            docket.register(simple_task)

            # Add and run a task
>           execution = await docket.add(simple_task)("test-value")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_memory_backend.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
\'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

_________________________ test_multiple_memory_dockets _________________________

self = 
keys = ['docket-1:stream', 'docket-1:known:019ce309-3431-71ed-a882-3b33176159df', 'docket-1:019ce309-3431-71ed-a882-3b3317615...ueue', 'docket-1:stream-id:019ce309-3431-71ed-a882-3b33176159df', 'docket-1:runs:019ce309-3431-71ed-a882-3b33176159df']
args = ('docket-1:stream', 'docket-1:known:019ce309-3431-71ed-a882-3b33176159df', 'docket-1:019ce309-3431-71ed-a882-3b3317615..., 'docket-1:stream-id:019ce309-3431-71ed-a882-3b33176159df', 'docket-1:runs:019ce309-3431-71ed-a882-3b33176159df', ...)
client = ,db=0)>)>)>

    async def __call__(
        self,
        keys: Union[Sequence[KeyT], None] = None,
        args: Union[Iterable[EncodableT], None] = None,
        client: Union["redis.asyncio.client.Redis", None] = None,
    ):
        """Execute the script, passing any required ``args``"""
        keys = keys or []
        args = args or []
        if client is None:
            client = self.registered_client
        args = tuple(keys) + tuple(args)
        # make sure the Redis server knows about the script
        from redis.asyncio.client import Pipeline

        if isinstance(client, Pipeline):
            # Make sure the pipeline can register the script before executing.
            client.scripts.add(self)
        try:
>           return await client.evalsha(self.sha, len(keys), *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

/usr/lib/python3.14/site-packages/redis/commands/core.py:5572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.NoScriptError: No matching script. Please use EVAL.

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError

During handling of the above exception, another exception occurred:

    async def test_multiple_memory_dockets():
        """Test that multiple in-memory dockets can coexist with separate data."""
        async with (
            Docket(name="docket-1", url="memory://") as docket1,
            Docket(name="docket-2", url="memory://") as docket2,
        ):
            result1 = None
            result2 = None

            async def task1(value: str) -> str:
                nonlocal result1
                result1 = value
                return value

            async def task2(value: str) -> str:
                nonlocal result2
                result2 = value
                return value

            docket1.register(task1)
            docket2.register(task2)

            # Add tasks to separate dockets
>           await docket1.add(task1)("docket1-value")

tests/test_memory_backend.py:66:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

______________________ test_memory_backend_reuses_server _______________________
self = 
keys = ['docket-shared:stream', 'docket-shared:known:019ce309-35c2-7679-aaf7-28bae6cc518c', 'docket-shared:019ce309-35c2-7679...cket-shared:stream-id:019ce309-35c2-7679-aaf7-28bae6cc518c', 'docket-shared:runs:019ce309-35c2-7679-aaf7-28bae6cc518c']
args = ('docket-shared:stream', 'docket-shared:known:019ce309-35c2-7679-aaf7-28bae6cc518c', 'docket-shared:019ce309-35c2-7679...shared:stream-id:019ce309-35c2-7679-aaf7-28bae6cc518c', 'docket-shared:runs:019ce309-35c2-7679-aaf7-28bae6cc518c', ...)
client = ,db=0)>)>)>

    async def __call__(
        self,
        keys: Union[Sequence[KeyT], None] = None,
        args: Union[Iterable[EncodableT], None] = None,
        client: Union["redis.asyncio.client.Redis", None] = None,
    ):
        """Execute the script, passing any required ``args``"""
        keys = keys or []
        args = args or []
        if client is None:
            client = self.registered_client
        args = tuple(keys) + tuple(args)
        # make sure the Redis server knows about the script
        from redis.asyncio.client import Pipeline

        if isinstance(client, Pipeline):
            # Make sure the pipeline can register the script before executing.
            client.scripts.add(self)
        try:
>           return await client.evalsha(self.sha, len(keys), *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

/usr/lib/python3.14/site-packages/redis/commands/core.py:5572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.NoScriptError: No matching script. Please use EVAL.

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError

During handling of the above exception, another exception occurred:

    async def test_memory_backend_reuses_server():
        """Test that identical memory:// URLs share the same FakeServer instance."""
        result = None

        async def shared_task(value: str) -> str:
            nonlocal result
            result = value
            return value

        # Create first docket and run a task
        async with Docket(name="docket-shared", url="memory://") as docket1:
            docket1.register(shared_task)

>           await docket1.add(shared_task)("shared-value")

tests/test_memory_backend.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError

___________________ test_different_memory_urls_are_isolated ____________________

self = 
keys = ['test:stream', 'test:known:019ce309-3765-700f-95af-df5dbe55d894', 'test:019ce309-3765-700f-95af-df5dbe55d894', 'test:queue', 'test:stream-id:019ce309-3765-700f-95af-df5dbe55d894', 'test:runs:019ce309-3765-700f-95af-df5dbe55d894']
args = ('test:stream', 'test:known:019ce309-3765-700f-95af-df5dbe55d894', 'test:019ce309-3765-700f-95af-df5dbe55d894', 'test:queue', 'test:stream-id:019ce309-3765-700f-95af-df5dbe55d894', 'test:runs:019ce309-3765-700f-95af-df5dbe55d894', ...)
client = ,db=0)>)>)>

    async def __call__(
        self,
        keys: Union[Sequence[KeyT], None] = None,
        args: Union[Iterable[EncodableT], None] = None,
        client: Union["redis.asyncio.client.Redis", None] = None,
    ):
        """Execute the script, passing any required ``args``"""
        keys = keys or []
        args = args or []
        if client is None:
            client = self.registered_client
        args = tuple(keys) + tuple(args)
        # make sure the Redis server knows about the script
        from redis.asyncio.client import Pipeline

        if isinstance(client, Pipeline):
            # Make sure the pipeline can register the script before executing.
            client.scripts.add(self)
        try:
>           return await client.evalsha(self.sha, len(keys), *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

/usr/lib/python3.14/site-packages/redis/commands/core.py:5572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.NoScriptError: No matching script. Please use EVAL.

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError

During handling of the above exception, another exception occurred:

    async def test_different_memory_urls_are_isolated():
        """Test that different memory:// URLs get completely separate FakeServer instances."""
        result1 = None
        result2 = None

        async def task_for_server1(value: str) -> str:
            nonlocal result1
            result1 = value
            return value

        async def task_for_server2(value: str) -> str:
            ...
# Create two dockets with different memory:// URLs async with ( Docket(name="test", url="memory://server1") as docket1, Docket(name="test", url="memory://server2") as docket2, ): docket1.register(task_for_server1) docket2.register(task_for_server2) # Add task only to server1 > await docket1.add(task_for_server1)("value-for-server1") tests/test_memory_backend.py:130: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not 
self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_memory_url_with_path_isolation ______________________ self = keys = ['test:stream', 'test:known:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:queue', 'test:stream-id:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:runs:019ce309-395a-72d0-bfae-dffed833ee5f'] args = ('test:stream', 'test:known:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:queue', 'test:stream-id:019ce309-395a-72d0-bfae-dffed833ee5f', 'test:runs:019ce309-395a-72d0-bfae-dffed833ee5f', ...) client = ,db=0)>)>)> async def __call__( self, keys: Union[Sequence[KeyT], None] = None, args: Union[Iterable[EncodableT], None] = None, client: Union["redis.asyncio.client.Redis", None] = None, ): """Execute the script, passing any required ``args``""" keys = keys or [] args = args or [] if client is None: client = self.registered_client args = tuple(keys) + tuple(args) # make sure the Redis server knows about the script from redis.asyncio.client import Pipeline if isinstance(client, Pipeline): # Make sure the pipeline can register the script before executing. 
client.scripts.add(self) try: > return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {}, response = NoScriptError('No matching script. Please use EVAL.') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.NoScriptError: No matching script. Please use EVAL. 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError During handling of the above exception, another exception occurred: async def test_memory_url_with_path_isolation(): """Test that memory:// URLs with different paths are isolated.""" async with ( Docket(name="test", url="memory://localhost/db1") as docket1, Docket(name="test", url="memory://localhost/db2") as docket2, ): async def dummy_task() -> None: ... docket1.register(dummy_task) # Add task to db1 > await docket1.add(dummy_task)() tests/test_memory_backend.py:167: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = 
None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _ test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=0] _ self = keys = ['test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:stream', 'test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:known:pe...10d9c8ae06:stream-id:perpetual-race-test', 'test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:runs:perpetual-race-test'] args = ('test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:stream', 'test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:known:pe...8ae06:stream-id:perpetual-race-test', 'test-docket-75efdb69-76ed-4cb9-89ed-b410d9c8ae06:runs:perpetual-race-test', ...) 
client = ,db=0)>)>)> async def __call__( self, keys: Union[Sequence[KeyT], None] = None, args: Union[Iterable[EncodableT], None] = None, client: Union["redis.asyncio.client.Redis", None] = None, ): """Execute the script, passing any required ``args``""" keys = keys or [] args = args or [] if client is None: client = self.registered_client args = tuple(keys) + tuple(args) # make sure the Redis server knows about the script from redis.asyncio.client import Pipeline if isinstance(client, Pipeline): # Make sure the pipeline can register the script before executing. client.scripts.add(self) try: > return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {}, response = NoScriptError('No matching script. 
Please use EVAL.') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.NoScriptError: No matching script. Please use EVAL. /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: NoScriptError During handling of the above exception, another exception occurred: docket = worker = async def test_stale_perpetual_on_complete_overwrites_correct_successor( docket: Docket, worker: Worker ): """When a running Perpetual task is externally replaced and finishes after the replacement, its on_complete overwrites the correctly-timed successor.""" config: dict[str, timedelta] = {"interval": STALE_INTERVAL} task_a_started = asyncio.Event() let_a_finish = asyncio.Event() task_b_started = asyncio.Event() let_b_finish = asyncio.Event() executions: list[Execution] = [] async def racing_task( perpetual: Perpetual = Perpetual(), execution: Execution = CurrentExecution(), ): my_interval = config["interval"] call_number = len(executions) + 1 executions.append(execution) if call_number == 1: # Task A: signal start, block until released task_a_started.set() await asyncio.wait_for(let_a_finish.wait(), timeout=10) elif call_number == 2: # Task B: signal start, block until released task_b_started.set() await asyncio.wait_for(let_b_finish.wait(), timeout=10) # Task C (call 3): the successor — just runs perpetual.after(my_interval) # Schedule the initial task (A) > await 
docket.add(racing_task, key=TASK_KEY)() tests/test_perpetual_race.py:87: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5578: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = 
await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _ test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=60s] _ docket = worker = async def test_stale_perpetual_on_complete_overwrites_correct_successor( docket: Docket, worker: Worker ): """When a running Perpetual task is externally replaced and finishes after the replacement, its on_complete overwrites the correctly-timed successor.""" config: dict[str, timedelta] = {"interval": STALE_INTERVAL} task_a_started = asyncio.Event() let_a_finish = asyncio.Event() task_b_started = asyncio.Event() let_b_finish = asyncio.Event() executions: list[Execution] = [] async def racing_task( perpetual: Perpetual = Perpetual(), execution: Execution = CurrentExecution(), ): my_interval = config["interval"] call_number = len(executions) + 1 executions.append(execution) if call_number == 1: # Task A: signal start, block until released task_a_started.set() await asyncio.wait_for(let_a_finish.wait(), timeout=10) elif call_number == 2: # Task B: signal start, block until released task_b_started.set() await asyncio.wait_for(let_b_finish.wait(), timeout=10) # Task C (call 3): the successor — just runs perpetual.after(my_interval) # Schedule the initial task (A) > await docket.add(racing_task, key=TASK_KEY)() tests/test_perpetual_race.py:87: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_is_superseded_after_replace[execution_ttl=0] _______________ docket = async def test_is_superseded_after_replace(docket: Docket): """An execution becomes superseded when the same key is rescheduled.""" async def noop(): pass # pragma: no cover > await docket.add(noop, key="gen-test")() tests/test_perpetual_race.py:139: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, 
can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_is_superseded_after_replace[execution_ttl=60s] ______________ docket = async def test_is_superseded_after_replace(docket: Docket): """An execution becomes superseded when the same key is rescheduled.""" async def noop(): pass # pragma: no cover > await docket.add(noop, key="gen-test")() tests/test_perpetual_race.py:139: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await 
do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______ test_superseded_message_skipped_before_execution[execution_ttl=0] _______ docket = worker = async def test_superseded_message_skipped_before_execution( docket: Docket, worker: Worker ): """A stale message in the stream is skipped without running the function. 
This covers the case where a message was already pending (e.g. after a worker crash and redelivery) when the task was replaced. The runs hash has a newer generation so the worker bails before claim(). """ calls: list[str] = [] async def tracked_task(): calls.append("ran") # pragma: no cover > await docket.add(tracked_task, key="head-check")() tests/test_perpetual_race.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if 
not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____ test_superseded_message_skipped_before_execution[execution_ttl=60s] ______ docket = worker = async def test_superseded_message_skipped_before_execution( docket: Docket, worker: Worker ): """A stale message in the stream is skipped without running the function. This covers the case where a message was already pending (e.g. after a worker crash and redelivery) when the task was replaced. The runs hash has a newer generation so the worker bails before claim(). 
""" calls: list[str] = [] async def tracked_task(): calls.append("ran") # pragma: no cover > await docket.add(tracked_task, key="head-check")() tests/test_perpetual_race.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______ test_old_message_without_generation_runs_normally[execution_ttl=0] ______ + Exception Group Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_perpetual_race.py", line 239, in test_old_message_without_generation_runs_normally | await worker.run_until_finished() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished | return await self._run(forever=False) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run | return await self._worker_loop(redis, forever=forever) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop | async with TaskGroup() as infra: | ~~~~~~~~~^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__ | return await self._aexit(et, exc) | ^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit | raise BaseExceptionGroup( | ...<2 lines>... 
| ) from None | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.NoScriptError: No matching script. Please use EVAL. 
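The `NoScriptError: No matching script. Please use EVAL.` above is the server's NOSCRIPT reply to EVALSHA when the script cache is cold; redis-py's `Script` objects handle it by re-sending the source with EVAL, which is why a second, different error (the Lua `unpack` failure) then appears "during handling" of this one. A minimal sketch of that fallback pattern, using an illustrative toy client rather than a real Redis connection:

```python
import hashlib


class NoScriptError(Exception):
    """Stand-in for redis.exceptions.NoScriptError (the NOSCRIPT reply)."""


def evalsha_with_fallback(client, script, keys=(), args=()):
    # EVALSHA addresses a script already cached server-side by its SHA1
    # digest. If the server has never seen that digest, it replies NOSCRIPT
    # and the client falls back to EVAL, which ships the full source (and
    # caches it for future EVALSHA calls).
    sha = hashlib.sha1(script.encode()).hexdigest()
    try:
        return client.evalsha(sha, len(keys), *keys, *args)
    except NoScriptError:
        return client.eval(script, len(keys), *keys, *args)


class EmptyCacheClient:
    """Toy client whose script cache is always cold."""

    def evalsha(self, sha, numkeys, *rest):
        raise NoScriptError("No matching script. Please use EVAL.")

    def eval(self, script, numkeys, *rest):
        return "ran via EVAL"
```

In the failures above the fallback itself appears to work; it is the retried EVAL that then trips over the Lua error, so the NoScriptError is incidental rather than the root cause.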
| | During handling of the above exception, another exception occurred: | | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute | await execution.mark_as_completed(result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5578, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... 
| ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk | | During handling of the above exception, another exception occurred: | | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 571, in _worker_loop | await process_completed_tasks() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 520, in process_completed_tasks | await task | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 998, in _execute | await execution.mark_as_failed(error_msg, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 748, in mark_as_failed | await self._mark_as_terminal( | ExecutionState.FAILED, error=error, result_key=result_key | ) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | 
...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) 
INFO:docket.worker:* legacy_task() DEBUG:docket.worker:Getting redeliveries DEBUG:docket.worker:Getting new deliveries DEBUG:docket.worker:Getting redeliveries DEBUG:docket.worker:Getting new deliveries DEBUG:docket.worker:Scheduling due tasks INFO:docket.worker:↪ [ 10ms] legacy_task(){old-to-new} ERROR:docket.worker:↩ [ 1ms] legacy_task(){old-to-new} Traceback (most recent call last): File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... ) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.NoScriptError: No matching script. Please use EVAL. 
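The recurring `attempt to call a nil value (global 'unpack')` failures are consistent with the scripts running under Lua 5.2 or newer, where the global `unpack` was moved to `table.unpack`; real Redis embeds Lua 5.1 (where the global exists), while fakeredis's interpreter may be a newer Lua. One possible workaround, sketched here with illustrative names, is to prepend a compatibility line to the Lua source before registering it:

```python
# Shim that restores the 5.1-style global `unpack` on Lua 5.2+ runtimes.
# Harmless on Lua 5.1, where `unpack` is already defined and `or` short-circuits.
LUA_COMPAT_SHIM = "local unpack = unpack or table.unpack\n"


def shim_lua_script(source: str) -> str:
    """Prepend the shim unless the script is already 5.2+-aware."""
    if "table.unpack" in source:
        return source
    return LUA_COMPAT_SHIM + source
```

Whether patching the scripts (as above) or pinning the test environment to a fakeredis/lupa build with Lua 5.1 semantics is the right fix here is a packaging decision; the shim only illustrates the version mismatch behind the error.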
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5578, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:1007 * legacy_task() DEBUG docket.worker:worker.py:444 Getting redeliveries DEBUG docket.worker:worker.py:465 Getting new deliveries DEBUG docket.worker:worker.py:444 Getting redeliveries DEBUG docket.worker:worker.py:465 Getting new deliveries DEBUG docket.worker:worker.py:678 Scheduling due tasks INFO docket.worker:worker.py:828 ↪ [ 10ms] legacy_task(){old-to-new} ERROR docket.worker:worker.py:975 ↩ [ 1ms] legacy_task(){old-to-new} Traceback (most recent call last): File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... ) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.NoScriptError: No matching script. Please use EVAL. 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5578, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk _____ test_old_message_without_generation_runs_normally[execution_ttl=60s] _____ + Exception Group Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_perpetual_race.py", line 239, in test_old_message_without_generation_runs_normally | await worker.run_until_finished() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished | return await self._run(forever=False) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run | return await self._worker_loop(redis, forever=forever) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop | async with TaskGroup() as infra: | ~~~~~~~~~^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__ | return await self._aexit(et, exc) | 
^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit | raise BaseExceptionGroup( | ...<2 lines>... | ) from None | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute | await execution.mark_as_completed(result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... 
| ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk | | During handling of the above exception, another exception occurred: | | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 571, in _worker_loop | await process_completed_tasks() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 520, in process_completed_tasks | await task | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 998, in _execute | await execution.mark_as_failed(error_msg, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 748, in mark_as_failed | await self._mark_as_terminal( | ExecutionState.FAILED, error=error, result_key=result_key | ) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | 
...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) 
INFO:docket.worker:* legacy_task() DEBUG:docket.worker:Getting redeliveries DEBUG:docket.worker:Getting new deliveries DEBUG:docket.worker:Getting redeliveries DEBUG:docket.worker:Getting new deliveries DEBUG:docket.worker:Scheduling due tasks INFO:docket.worker:↪ [ 9ms] legacy_task(){old-to-new} ERROR:docket.worker:↩ [ 1ms] legacy_task(){old-to-new} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:1007 * legacy_task() DEBUG docket.worker:worker.py:444 Getting redeliveries DEBUG docket.worker:worker.py:465 Getting new deliveries DEBUG docket.worker:worker.py:444 Getting redeliveries DEBUG docket.worker:worker.py:465 Getting new deliveries DEBUG docket.worker:worker.py:678 Scheduling due tasks INFO docket.worker:worker.py:828 ↪ [ 9ms] legacy_task(){old-to-new} ERROR docket.worker:worker.py:975 ↩ [ 1ms] legacy_task(){old-to-new} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk _____ test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=0] ______ docket = worker = async def test_new_task_moved_by_old_scheduler_runs_normally( docket: Docket, worker: Worker ): """A task scheduled by new code but moved to stream by old scheduler runs. Simulates the new→old scheduler→new worker upgrade path: new code schedules a task (generation=1 in runs hash), but an older worker's scheduler Lua moves it from queue to stream WITHOUT the generation field. The new worker receives generation=0 from the message, sees generation=1 in the runs hash, but still runs the task because generation=0 is treated as "pre-tracking". 
""" calls: list[str] = [] async def upgraded_task(): calls.append("ran") docket.register(upgraded_task) # Schedule with new code to get generation=1 in the runs hash future = datetime.now(timezone.utc) + timedelta(hours=1) > await docket.add(upgraded_task, when=future, key="new-old-new")() tests/test_perpetual_race.py:264: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____ test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=60s] _____ docket = worker = async def test_new_task_moved_by_old_scheduler_runs_normally( docket: Docket, worker: Worker ): """A task scheduled by new code but moved to stream by old scheduler runs. Simulates the new→old scheduler→new worker upgrade path: new code schedules a task (generation=1 in runs hash), but an older worker's scheduler Lua moves it from queue to stream WITHOUT the generation field. The new worker receives generation=0 from the message, sees generation=1 in the runs hash, but still runs the task because generation=0 is treated as "pre-tracking". 
""" calls: list[str] = [] async def upgraded_task(): calls.append("ran") docket.register(upgraded_task) # Schedule with new code to get generation=1 in the runs hash future = datetime.now(timezone.utc) + timedelta(hours=1) > await docket.add(upgraded_task, when=future, key="new-old-new")() tests/test_perpetual_race.py:264: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_replace_skips_stale_stream_message[execution_ttl=0] ___________ docket = worker = async def test_replace_skips_stale_stream_message(docket: Docket, worker: Worker): """When replace() can't XDEL a message the worker already read, the generation check prevents the stale message from executing. This simulates the tight race where XREADGROUP delivers a message to the worker before replace()'s XDEL can remove it, so the stream has both the old (gen=1) and new (gen=2) messages. Only gen=2 should run. 
""" calls: list[int] = [] async def tracked_task( execution: Execution = CurrentExecution(), ): calls.append(execution.generation) # Schedule the task (gen=1 in stream and runs hash) > await docket.add(tracked_task, key="replace-race")() tests/test_perpetual_race.py:314: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________ test_replace_skips_stale_stream_message[execution_ttl=60s] __________ docket = worker = async def test_replace_skips_stale_stream_message(docket: Docket, worker: Worker): """When replace() can't XDEL a message the worker already read, the generation check prevents the stale message from executing. This simulates the tight race where XREADGROUP delivers a message to the worker before replace()'s XDEL can remove it, so the stream has both the old (gen=1) and new (gen=2) messages. Only gen=2 should run. 
""" calls: list[int] = [] async def tracked_task( execution: Execution = CurrentExecution(), ): calls.append(execution.generation) # Schedule the task (gen=1 in stream and runs hash) > await docket.add(tracked_task, key="replace-race")() tests/test_perpetual_race.py:314: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____ test_perpetual_successor_survives_mark_as_terminal[execution_ttl=0] ______ docket = worker = async def test_perpetual_successor_survives_mark_as_terminal( docket: Docket, worker: Worker ): """After a Perpetual task completes, the successor's runs hash must survive. on_complete calls docket.replace() which writes the successor's state (generation, state=queued/scheduled) to the runs hash. Then the worker calls mark_as_completed → _mark_as_terminal which either overwrites state=queued with state=completed or (with execution_ttl=0) deletes the entire hash. Either way the successor becomes invisible. 
""" before = datetime.now(timezone.utc) executed = asyncio.Event() async def simple_perpetual(perpetual: Perpetual = Perpetual()): perpetual.after(timedelta(hours=1)) executed.set() key = "successor-survives" > await docket.add(simple_perpetual, key=key)() tests/test_perpetual_race.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except 
asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____ test_perpetual_successor_survives_mark_as_terminal[execution_ttl=60s] _____ docket = worker = async def test_perpetual_successor_survives_mark_as_terminal( docket: Docket, worker: Worker ): """After a Perpetual task completes, the successor's runs hash must survive. on_complete calls docket.replace() which writes the successor's state (generation, state=queued/scheduled) to the runs hash. Then the worker calls mark_as_completed → _mark_as_terminal which either overwrites state=queued with state=completed or (with execution_ttl=0) deletes the entire hash. Either way the successor becomes invisible. 
""" before = datetime.now(timezone.utc) executed = asyncio.Event() async def simple_perpetual(perpetual: Perpetual = Perpetual()): perpetual.after(timedelta(hours=1)) executed.set() key = "successor-survives" > await docket.add(simple_perpetual, key=key)() tests/test_perpetual_race.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except 
asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_perpetual_task_with_ttl_zero _______________________ zero_ttl_docket = async def test_perpetual_task_with_ttl_zero(zero_ttl_docket: Docket) -> None: """Perpetual tasks should work correctly with TTL of 0.""" executions: list[str] = [] async def perpetual_task( execution: Execution = CurrentExecution(), perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)), ) -> None: executions.append(execution.key) if len(executions) >= 3: perpetual.cancel() zero_ttl_docket.register(perpetual_task) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_perpetual_state.py:30: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return 
await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_perpetual_task_state_isolation ______________________ docket = worker = async def test_perpetual_task_state_isolation(docket: Docket, worker: 
Worker) -> None: """Perpetual tasks with the same key should execute independently.""" executions: list[str] = [] async def perpetual_task( execution: Execution = CurrentExecution(), perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)), ) -> None: executions.append(execution.key) if len(executions) >= 3: perpetual.cancel() docket.register(perpetual_task) > execution = await docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_perpetual_state.py:51: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: 
Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_perpetual_task_no_state_accumulation_with_ttl_zero ____________ zero_ttl_docket = async def test_perpetual_task_no_state_accumulation_with_ttl_zero( zero_ttl_docket: Docket, ) -> None: """Perpetual tasks with TTL=0 should not accumulate state records.""" executions: list[str] = [] async def perpetual_task( execution: Execution = CurrentExecution(), perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)), ) -> None: executions.append(execution.key) if len(executions) >= 5: perpetual.cancel() zero_ttl_docket.register(perpetual_task) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_perpetual_state.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_rapid_perpetual_tasks_no_conflicts ____________________ docket = worker = async def test_rapid_perpetual_tasks_no_conflicts( docket: Docket, worker: Worker ) -> None: """Rapid perpetual tasks should not have state conflicts.""" executions: list[str] = [] async def rapid_perpetual( execution: Execution = CurrentExecution(), perpetual: Perpetual = Perpetual(every=timedelta(0)), ) -> None: executions.append(execution.key) if len(executions) >= 10: perpetual.cancel() docket.register(rapid_perpetual) > execution = await docket.add(rapid_perpetual)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_perpetual_state.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call 
to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_perpetual_same_key_no_state_accumulation _________________ docket = worker = async def test_perpetual_same_key_no_state_accumulation( docket: Docket, worker: Worker ) -> None: """Multiple cycles of perpetual task with same key should not accumulate state records.""" executions: list[str] = [] async def perpetual_task( execution: Execution = CurrentExecution(), perpetual: Perpetual = Perpetual(every=timedelta(0)), ) -> None: executions.append(execution.key) if len(executions) >= 10: perpetual.cancel() docket.register(perpetual_task) > execution = await docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_perpetual_state.py:129: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await 
execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_perpetual_task_state_transitions_with_same_key ______________

docket = 
worker = 

    async def test_perpetual_task_state_transitions_with_same_key(
        docket: Docket, worker: Worker
    ) -> None:
        """Each cycle of a perpetual task should use the same key."""
        executions: list[str] = []

        async def perpetual_tracking_keys(
            execution: Execution = CurrentExecution(),
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=20)),
        ) -> None:
            executions.append(execution.key)
            if len(executions) >= 5:
                perpetual.cancel()

        docket.register(perpetual_tracking_keys)

>       execution = await docket.add(perpetual_tracking_keys)()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_perpetual_state.py:170: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_perpetual_publishes_completed_event[ttl_zero] ______________

pubsub_docket = 

    async def test_perpetual_publishes_completed_event(pubsub_docket: Docket):
        """Perpetual tasks must still publish a completed state event on pub/sub.

        _mark_as_terminal skips the runs hash write when the successor has already
        been scheduled, but the pub/sub notification must still fire so that
        callers waiting via execution.subscribe() or get_result() see completion.
        """

        async def simple_perpetual(perpetual: Perpetual = Perpetual()):
            perpetual.after(timedelta(hours=1))

>       execution = await pubsub_docket.add(simple_perpetual, key="pubsub-completion")()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_perpetual_state.py:209: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_perpetual_publishes_completed_event[default_ttl] _____________

pubsub_docket = 

    async def test_perpetual_publishes_completed_event(pubsub_docket: Docket):
        """Perpetual tasks must still publish a completed state event on pub/sub.

        _mark_as_terminal skips the runs hash write when the successor has already
        been scheduled, but the pub/sub notification must still fire so that
        callers waiting via execution.subscribe() or get_result() see completion.
        """

        async def simple_perpetual(perpetual: Perpetual = Perpetual()):
            perpetual.after(timedelta(hours=1))

>       execution = await pubsub_docket.add(simple_perpetual, key="pubsub-completion")()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_perpetual_state.py:209: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_progress_dependency_injection ______________________

docket = 
worker = 

    async def test_progress_dependency_injection(docket: Docket, worker: Worker):
        """Progress dependency should be injected into task functions."""
        progress_values: list[int] = []

        async def task_with_progress(progress: Progress = Progress()):
            await progress.set_total(10)
            for i in range(10):
                await asyncio.sleep(0.001)
                await progress.increment()
                await progress.set_message(f"Processing item {i + 1}")
                # Capture progress data
                assert progress.current is not None
                progress_values.append(progress.current)

>       await docket.add(task_with_progress)()

tests/test_progress_basics.py:108: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_progress_deleted_on_completion ______________________

docket = 
worker = 

    async def test_progress_deleted_on_completion(docket: Docket, worker: Worker):
        """Progress data should be deleted when task completes."""

        async def task_with_progress(progress: Progress = Progress()):
            await progress.set_total(5)
            await progress.increment()

>       execution = await docket.add(task_with_progress)()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_basics.py:124: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_progress_with_multiple_increments ____________________

docket = 
worker = 

    async def test_progress_with_multiple_increments(docket: Docket, worker: Worker):
        """Test progress tracking with realistic usage pattern."""

        async def process_items(items: list[int], progress: Progress = Progress()):
            await progress.set_total(len(items))
            await progress.set_message("Starting processing")
            for i in range(len(items)):
                await asyncio.sleep(0.001)  # Simulate work
                await progress.increment()
                await progress.set_message(f"Processed item {i + 1}/{len(items)}")
            await progress.set_message("All items processed")

        items = list(range(20))

>       execution = await docket.add(process_items)(items)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_basics.py:152: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________________ test_progress_without_total __________________________

docket = 
worker = 

    async def test_progress_without_total(docket: Docket, worker: Worker):
        """Progress should work even without setting total."""

        async def task_without_total(progress: Progress = Progress()):
            for _ in range(5):
                await progress.increment()
                await asyncio.sleep(0.001)

>       execution = await docket.add(task_without_total)()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_basics.py:169: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________________ test_state_publish_events ___________________________

docket = 
the_task = 

    async def test_state_publish_events(docket: Docket, the_task: AsyncMock):
        """State changes should publish events to pub/sub channel."""
        # Note: This test verifies the pub/sub mechanism works.
        # Pub/sub is skipped for memory:// backend, so this test effectively
        # documents the expected behavior for real Redis backends.
>       execution = await docket.add(the_task, key="test-key")()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_pubsub.py:83: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_completed_state_publishes_event _____________________

execution = 

    async def test_completed_state_publishes_event(execution: Execution):
        """Completed state should publish event with completed_at timestamp."""
        # Set up subscriber
        events: list[StateEvent] = []

        async def collect_events():
            async for event in execution.subscribe():  # pragma: no cover
                if event["type"] == "state":
                    events.append(event)
                    if any(e["state"] == ExecutionState.COMPLETED for e in events):
                        break

        subscriber_task = asyncio.create_task(collect_events())
        await asyncio.sleep(0.1)

        await execution.claim("worker-1")
>       await execution.mark_as_completed()

tests/test_progress_pubsub.py:165: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:737: in mark_as_completed
    await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:706: in _mark_as_terminal
    await terminal_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:22: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:22: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_failed_state_publishes_event_with_error _________________

execution = 

    async def test_failed_state_publishes_event_with_error(execution: Execution):
        """Failed state should publish event with error message."""
        # Set up subscriber
        events: list[StateEvent] = []

        async def collect_events():
            async for event in execution.subscribe():  # pragma: no cover
                if event["type"] == "state":
                    events.append(event)
                    if any(e["state"] == ExecutionState.FAILED for e in events):
                        break

        subscriber_task = asyncio.create_task(collect_events())
        await asyncio.sleep(0.1)

        await execution.claim("worker-1")
>       await execution.mark_as_failed("Something went wrong!")

tests/test_progress_pubsub.py:191: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:748: in mark_as_failed
    await self._mark_as_terminal(
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:706: in _mark_as_terminal
    await terminal_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:22: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:22: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_end_to_end_progress_monitoring_with_worker ________________

docket = 
worker = 

    async def test_end_to_end_progress_monitoring_with_worker(
        docket: Docket, worker: Worker
    ):
        """Test complete end-to-end progress monitoring with real worker execution."""
        collected_events: list[StateEvent | ProgressEvent] = []

        async def task_with_progress(progress: Progress = Progress()):
            """Task that reports progress as it executes."""
            await progress.set_total(5)
            await progress.set_message("Starting work")
            for i in range(5):
                await asyncio.sleep(0.01)
                await progress.increment()
                await progress.set_message(f"Processing step {i + 1}/5")
            await progress.set_message("Work complete")

        # Schedule the task
>       execution = await docket.add(task_with_progress)()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_pubsub.py:221: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_end_to_end_failed_task_monitoring ____________________

docket = 
worker = 

    async def test_end_to_end_failed_task_monitoring(docket: Docket, worker: Worker):
        """Test progress monitoring for a task that fails."""
        collected_events: list[StateEvent | ProgressEvent] = []

        async def failing_task(progress: Progress = Progress()):
            """Task that reports progress then fails."""
            await progress.set_total(10)
            await progress.set_message("Starting work")
            await progress.increment(3)
            await progress.set_message("About to fail")
            raise ValueError("Task failed intentionally")

        # Schedule the task
>       execution = await docket.add(failing_task)()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_pubsub.py:300: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_subscribing_to_completed_execution ____________________

docket = 
worker = 

    async def test_subscribing_to_completed_execution(docket: Docket, worker: Worker):
        """Subscribing to already-completed executions should emit final state."""

        async def completed_task():
            await asyncio.sleep(0.01)

        async def failed_task():
            await asyncio.sleep(0.01)
            raise ValueError("Task failed")

        # Test subscribing to a completed task
>       execution = await docket.add(completed_task, key="already-done:123")()
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_progress_pubsub.py:362: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_redelivery_from_abandoned_worker _____________________ docket = the_task = async def test_redelivery_from_abandoned_worker(docket: Docket, the_task: AsyncMock): """Tasks should be redelivered when a worker crashes or abandons them.""" > await docket.add(the_task)() tests/test_redelivery.py:36: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main 
chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_long_running_task_not_duplicated _____________________ docket = async def test_long_running_task_not_duplicated(docket: Docket): """Test that lease renewal prevents duplicate execution when task exceeds redelivery_timeout. This test runs a task that takes 500ms with a 200ms redelivery_timeout. Without lease renewal, XAUTOCLAIM would reclaim the message after 200ms, causing duplicate execution. With lease renewal (every 50ms), the message stays claimed and no duplicates occur. 
""" executions: list[int] = [] async def slow_task(task_id: int): executions.append(task_id) await asyncio.sleep(0.5) > await docket.add(slow_task, key="slow-1")(task_id=1) tests/test_redelivery.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_retry_with_long_running_task _______________________ docket = async def test_retry_with_long_running_task(docket: Docket): """Test that retries work correctly with lease renewal. A task that fails and retries should still benefit from lease renewal. Each attempt should be a distinct execution without duplicates. """ attempts: list[tuple[str, int]] = [] async def flaky_task( task_id: str, retry: Retry = Retry(attempts=3, delay=timedelta(milliseconds=50)), ): attempts.append((task_id, retry.attempt)) await asyncio.sleep(0.3) if retry.attempt < 3: raise ValueError("Temporary failure") > await docket.add(flaky_task, key="flaky")(task_id="test") tests/test_redelivery.py:113: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in 
call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_multiple_workers_no_duplicate_execution _________________ docket = async def test_multiple_workers_no_duplicate_execution(docket: Docket): """Test that lease renewal prevents duplicates across multiple competing workers. 
With multiple workers and tasks that run longer than redelivery_timeout, XAUTOCLAIM could reclaim a message from one worker and deliver it to another. Lease renewal prevents this by keeping messages "fresh" while being processed. """ executions: list[tuple[str, int]] = [] lock = asyncio.Lock() async def slow_task(task_id: int, worker_name: str = ""): async with lock: executions.append((worker_name, task_id)) await asyncio.sleep(0.5) # Schedule several tasks for i in range(6): > await docket.add(slow_task, key=f"task-{i}")(task_id=i) tests/test_redelivery.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main 
chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_perpetual_task_with_lease_renewal ____________________ docket = async def test_perpetual_task_with_lease_renewal(docket: Docket): """Perpetual tasks that run longer than redelivery_timeout should reschedule correctly. Without lease renewal, a perpetual task running longer than redelivery_timeout could be reclaimed by XAUTOCLAIM, causing duplicate execution. 
""" executions: list[int] = [] async def slow_perpetual( perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=100)), ): executions.append(len(executions) + 1) await asyncio.sleep(0.4) # Longer than redelivery_timeout > await docket.add(slow_perpetual, key="perpetual")() tests/test_redelivery.py:187: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_user_timeout_longer_than_redelivery ___________________ docket = async def test_user_timeout_longer_than_redelivery(docket: Docket): """User-specified Timeout > redelivery_timeout should work with lease renewal. Lease renewal allows tasks to run longer than redelivery_timeout without being reclaimed by XAUTOCLAIM. 
""" task_completed = False async def long_task_with_timeout( timeout: Timeout = Timeout(timedelta(seconds=2)), ): nonlocal task_completed await asyncio.sleep(0.5) # Longer than redelivery_timeout task_completed = True > await docket.add(long_task_with_timeout)() tests/test_redelivery.py:217: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except 
asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_workers_with_same_redelivery_timeout ___________________ docket = async def test_workers_with_same_redelivery_timeout(docket: Docket): """Workers with consistent redelivery_timeouts should coexist correctly. When workers share the same redelivery_timeout, their lease renewal intervals are synchronized, preventing incorrect task reclamation via XAUTOCLAIM. Note: Workers with different redelivery_timeouts can cause issues - a worker with a shorter timeout may reclaim tasks from a worker with a longer timeout. Use consistent timeouts across workers in a cluster. 
""" executions: list[tuple[str, int]] = [] lock = asyncio.Lock() async def tracked_task(task_id: int): async with lock: executions.append(("started", task_id)) await asyncio.sleep(0.5) # Longer than redelivery_timeout async with lock: executions.append(("completed", task_id)) for i in range(4): > await docket.add(tracked_task, key=f"task-{i}")(task_id=i) tests/test_redelivery.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_worker_joining_doesnt_steal_renewed_lease ________________ docket = async def test_worker_joining_doesnt_steal_renewed_lease(docket: Docket): """A new worker joining shouldn't steal tasks that are actively being renewed. Worker A starts a task and renews its lease. Worker B joins later and runs XAUTOCLAIM, but shouldn't reclaim A's actively-renewed task. 
""" executions: list[tuple[str, int]] = [] task_started = asyncio.Event() async def slow_task(task_id: int): executions.append(("start", task_id)) task_started.set() await asyncio.sleep(0.6) # Long task executions.append(("end", task_id)) > await docket.add(slow_task, key="task")(task_id=1) tests/test_redelivery.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_lease_renewal_recovers_from_redis_error _________________ docket = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff9430f930> async def test_lease_renewal_recovers_from_redis_error( docket: Docket, caplog: pytest.LogCaptureFixture ): """Lease renewal should recover from transient Redis errors. If XCLAIM fails, the worker should log a warning and continue. The task should still complete successfully. 
""" task_completed = False async def slow_task(): nonlocal task_completed await asyncio.sleep(0.5) task_completed = True > await docket.add(slow_task)() tests/test_redelivery.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________________ test_get_result_waits_for_completion _____________________ docket = worker = async def test_get_result_waits_for_completion(docket: Docket, worker: Worker): """Test that get_result waits for execution to complete.""" result_value = 123 async def slow_task() -> int: await asyncio.sleep(0.1) return result_value docket.register(slow_task) > execution = await docket.add(slow_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:22: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_get_result_timeout ____________________________ docket = worker = async def test_get_result_timeout(docket: Docket, worker: Worker): """Test that get_result respects timeout.""" event = asyncio.Event() # Never set, simulates hung task async def hung_task(): await event.wait() docket.register(hung_task) > execution = await docket.add(hung_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:42: _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else 
None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_multiple_concurrent_get_result_calls ___________________ docket = worker = async def test_multiple_concurrent_get_result_calls(docket: Docket, worker: Worker): """Test that multiple concurrent get_result calls work correctly.""" result_value = 999 async def returns_value() -> int: await asyncio.sleep(0.05) return result_value docket.register(returns_value) > execution = await docket.add(returns_value)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:66: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_get_result_on_already_completed_task ___________________ docket = worker = async def test_get_result_on_already_completed_task(docket: Docket, worker: Worker): """Test get_result on an already completed task.""" result_value = 555 async def returns_value() -> int: return result_value docket.register(returns_value) > execution = await docket.add(returns_value)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:92: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) 
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): 
@user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_get_result_on_already_failed_task ____________________ docket = worker = async def test_get_result_on_already_failed_task(docket: Docket, worker: Worker): """Test get_result on an already failed task.""" async def raises_error() -> int: raise ValueError("test error") docket.register(raises_error) > execution = await docket.add(raises_error)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:111: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a 
nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_get_result_with_malformed_result_data __________________ docket = worker = async def test_get_result_with_malformed_result_data(docket: Docket, worker: Worker): """Test get_result gracefully handles malformed result data.""" async def returns_value() -> int: return 123 docket.register(returns_value) > execution = await docket.add(returns_value)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:165: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_get_result_failed_task_with_missing_exception_data ____________ 
docket = worker = async def test_get_result_failed_task_with_missing_exception_data( docket: Docket, worker: Worker ): """Test get_result on failed task when exception data is missing from storage.""" async def raises_error() -> int: raise ValueError("test error") docket.register(raises_error) > execution = await docket.add(raises_error)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_get_result_with_timeout_timedelta ____________________ docket = worker = async def test_get_result_with_timeout_timedelta(docket: Docket, worker: Worker): """Test get_result using timeout parameter (timedelta).""" async def returns_value() -> int: return 42 docket.register(returns_value) > execution = await docket.add(returns_value)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________________ test_get_result_with_deadline_datetime ____________________ docket = worker = async def test_get_result_with_deadline_datetime(docket: Docket, worker: Worker): """Test get_result using deadline parameter (datetime).""" async def returns_value() -> int: return 42 
docket.register(returns_value) > execution = await docket.add(returns_value)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:233: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_get_result_timeout_on_pending_task ____________________ docket = worker = async def test_get_result_timeout_on_pending_task(docket: Docket, worker: Worker): """Test get_result with timeout (timedelta) on pending task.""" event = asyncio.Event() async def waits_forever() -> int: await event.wait() return 42 docket.register(waits_forever) > execution = await docket.add(waits_forever)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_retrieval.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_result_storage_for_int_return ______________________ docket = worker = async def test_result_storage_for_int_return(docket: Docket, worker: Worker): """Test that int results are stored and retrievable.""" result_value = 42 async def returns_int() -> int: return result_value docket.register(returns_int) > execution = await docket.add(returns_int)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/test_results_storage.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else 
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_result_storage_for_str_return ______________________

docket =  worker = 

    async def test_result_storage_for_str_return(docket: Docket, worker: Worker):
        """Test that string results are stored and retrievable."""
        result_value = "hello world"

        async def returns_str() -> str:
            return result_value

        docket.register(returns_str)

>       execution = await docket.add(returns_str)()

tests/test_results_storage.py:54:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>  kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_result_storage_for_dict_return ______________________

    async def test_result_storage_for_dict_return(docket: Docket, worker: Worker):
        """Test that dict results are stored and retrievable."""
        result_value = {"key": "value", "number": 123}

        async def returns_dict() -> dict[str, Any]:
            return result_value

        docket.register(returns_dict)

>       execution = await docket.add(returns_dict)()

tests/test_results_storage.py:74:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_result_storage_for_object_return _____________________

    async def test_result_storage_for_object_return(docket: Docket, worker: Worker):
        """Test that object results are stored and retrievable."""

        class CustomObject:
            def __init__(self, value: int):
                self.value = value

            def __eq__(self, other: Any) -> bool:
                return isinstance(other, CustomObject) and self.value == other.value

        result_value = CustomObject(42)

        async def returns_object() -> CustomObject:
            return result_value

        docket.register(returns_object)

>       execution = await docket.add(returns_object)()

tests/test_results_storage.py:102:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_no_storage_for_none_annotated_task ____________________

    async def test_no_storage_for_none_annotated_task(docket: Docket, worker: Worker):
        """Test that tasks annotated with -> None don't store results."""

        async def returns_none_annotated() -> None:
            pass

        docket.register(returns_none_annotated)

>       execution = await docket.add(returns_none_annotated)()

tests/test_results_storage.py:121:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________________ test_no_storage_for_runtime_none _______________________

    async def test_no_storage_for_runtime_none(docket: Docket, worker: Worker):
        """Test that tasks returning None at runtime don't store results."""

        async def returns_none_runtime() -> int | None:
            return None

        docket.register(returns_none_runtime)

>       execution = await docket.add(returns_none_runtime)()

tests/test_results_storage.py:141:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________________ test_exception_storage_and_retrieval _____________________

    async def test_exception_storage_and_retrieval(docket: Docket, worker: Worker):
        """Test that exceptions are stored and re-raised."""
        error_msg = "Test error"
        error_code = 500

        async def raises_error() -> int:
            raise CustomError(error_msg, error_code)

        docket.register(raises_error)

>       execution = await docket.add(raises_error)()

tests/test_results_storage.py:163:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_result_key_stored_in_execution_record __________________

    async def test_result_key_stored_in_execution_record(docket: Docket, worker: Worker):
        """Test that result key is stored in execution record."""

        async def returns_value() -> int:
            return 123

        docket.register(returns_value)

>       execution = await docket.add(returns_value)()

tests/test_results_storage.py:187:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_strike_incomparable_values[>-42-string] _________________

operator = '>', value = 42, test_value = 'string'
docket =  caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff949e6270>

    @pytest.mark.parametrize(
        "operator,value,test_value",
        [
            (">", 42, "string"),  # comparing int with string
            ("<", "string", 42),  # comparing string with int
            (">=", None, 42),  # comparing None with int
            ("<=", 42, None),  # comparing int with None
            (">", {}, 42),  # comparing dict with int
            ("<", 42, {}),  # comparing int with dict
            (">=", [], 42),  # comparing list with int
            ("<=", 42, []),  # comparing int with list
        ],
    )
    async def test_strike_incomparable_values(
        operator: Operator,
        value: Any,
        test_value: Any,
        docket: Docket,
        caplog: pytest.LogCaptureFixture,
    ):
        """should handle incomparable values gracefully in strikes"""
        # Register a test task
        async def test_task(parameter: Any) -> None:
            pass  # pragma: no cover

        docket.register(test_task)

        # Create a strike with potentially incomparable values
        await docket.strike("test_task", "parameter", operator, value)

        # We should be able to add the task without errors, even if the strike would be
        # comparing incomparable values
>       execution = await docket.add(test_task)(test_value)

tests/test_striking.py:207:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.GREATER_THAN 'string'
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 429, in _is_match
    return value > strike_value
TypeError: '>' not supported between instances of 'str' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.GREATER_THAN 'string'
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 429, in _is_match
    return value > strike_value
TypeError: '>' not supported between instances of 'str' and 'int'
_________________ test_strike_incomparable_values[<-string-42] _________________

operator = '<', value = 'string', test_value = 42
docket =  caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f49a270>

>       execution = await docket.add(test_task)(test_value)

tests/test_striking.py:207:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 'string' Operator.LESS_THAN 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
TypeError: '<' not supported between instances of 'int' and 'str'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 'string' Operator.LESS_THAN 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
TypeError: '<' not supported between instances of 'int' and 'str'
_________________ test_strike_incomparable_values[>=-None-42] __________________

operator = '>=', value = None, test_value = 42
docket =  caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f4de890>

>       execution = await docket.add(test_task)(test_value)

tests/test_striking.py:207:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: None Operator.GREATER_THAN_OR_EQUAL 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 431, in _is_match
    return value >= strike_value
TypeError: '>=' not supported between instances of 'int' and 'NoneType'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: None Operator.GREATER_THAN_OR_EQUAL 42
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 431, in _is_match
    return value >= strike_value
TypeError: '>=' not supported between instances of 'int' and 'NoneType'
_________________ test_strike_incomparable_values[<=-42-None] __________________

operator = '<=', value = 42, test_value = None
docket =  caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f4df000>

>       execution = await docket.add(test_task)(test_value)

tests/test_striking.py:207:
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL None
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
TypeError: '<=' not supported between instances of 'NoneType' and 'int'
------------------------------ Captured log call -------------------------------
WARNING  docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL None
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
TypeError: '<=' not supported between instances of 'NoneType' and 'int'
_________________ test_strike_incomparable_values[>-value4-42] _________________

operator = '>', value = {}, test_value = 42
docket =  caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f4dff50>

>       execution = await docket.add(test_task)(test_value)

tests/test_striking.py:207:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in
parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ----------------------------- Captured stderr call ----------------------------- WARNING:docket.strikelist:Incompatible type for strike condition: Operator.GREATER_THAN {} ------------------------------ Captured log call ------------------------------- WARNING docket.strikelist:strikelist.py:457 Incompatible type for strike condition: Operator.GREATER_THAN {} ______________ test_strike_incomparable_values[<-42-test_value5] _______________ operator = '<', value = 42, test_value = {} docket = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff9433a900> @pytest.mark.parametrize( 
        "operator,value,test_value",
        [
            (">", 42, "string"),  # comparing int with string
            ("<", "string", 42),  # comparing string with int
            (">=", None, 42),  # comparing None with int
            ("<=", 42, None),  # comparing int with None
            (">", {}, 42),  # comparing dict with int
            ("<", 42, {}),  # comparing int with dict
            (">=", [], 42),  # comparing list with int
            ("<=", 42, []),  # comparing int with list
        ],
    )
    async def test_strike_incomparable_values(
        operator: Operator,
        value: Any,
        test_value: Any,
        docket: Docket,
        caplog: pytest.LogCaptureFixture,
    ):
        """should handle incomparable values gracefully in strikes"""
        # Register a test task
        async def test_task(parameter: Any) -> None:
            pass  # pragma: no cover

        docket.register(test_task)

        # Create a strike with potentially incomparable values
        await docket.strike("test_task", "parameter", operator, value)

        # We should be able to add the task without errors, even if the strike would be
        # comparing incomparable values
>       execution = await docket.add(test_task)(test_value)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_striking.py:207:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN {}
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'dict' and 'int'
------------------------------ Captured log call -------------------------------
WARNING docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN {}
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 433, in _is_match
    return value < strike_value
           ^^^^^^^^^^^^^^^^^^^^
TypeError: '<' not supported between instances of 'dict' and 'int'
________________ test_strike_incomparable_values[>=-value6-42] _________________

operator = '>=', value = [], test_value = 42
docket = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff94399860>

    @pytest.mark.parametrize(
        "operator,value,test_value",
        [
            (">", 42, "string"),  # comparing int with string
            ("<", "string", 42),  # comparing string with int
            (">=", None, 42),  # comparing None with int
            ("<=", 42, None),  # comparing int with None
            (">", {}, 42),  # comparing dict with int
            ("<", 42, {}),  # comparing int with dict
            (">=", [], 42),  # comparing list with int
            ("<=", 42, []),  # comparing int with list
        ],
    )
    async def test_strike_incomparable_values(
        operator: Operator,
        value: Any,
        test_value: Any,
        docket: Docket,
        caplog: pytest.LogCaptureFixture,
    ):
        """should handle incomparable values gracefully in strikes"""
        # Register a test task
        async def test_task(parameter: Any) -> None:
            pass  # pragma: no cover

        docket.register(test_task)

        # Create a strike with potentially incomparable values
        await docket.strike("test_task", "parameter", operator, value)

        # We should be able to add the task without errors, even if the strike would be
        # comparing incomparable values
>       execution = await docket.add(test_task)(test_value)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_striking.py:207:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: Operator.GREATER_THAN_OR_EQUAL []
------------------------------ Captured log call -------------------------------
WARNING docket.strikelist:strikelist.py:457 Incompatible type for strike condition: Operator.GREATER_THAN_OR_EQUAL []
______________ test_strike_incomparable_values[<=-42-test_value7] ______________

operator = '<=', value = 42, test_value = []
docket = 
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f74fa10>

    @pytest.mark.parametrize(
        "operator,value,test_value",
        [
            (">", 42, "string"),  # comparing int with string
            ("<", "string", 42),  # comparing string with int
            (">=", None, 42),  # comparing None with int
            ("<=", 42, None),  # comparing int with None
            (">", {}, 42),  # comparing dict with int
            ("<", 42, {}),  # comparing int with dict
            (">=", [], 42),  # comparing list with int
            ("<=", 42, []),  # comparing int with list
        ],
    )
    async def test_strike_incomparable_values(
        operator: Operator,
        value: Any,
        test_value: Any,
        docket: Docket,
        caplog: pytest.LogCaptureFixture,
    ):
        """should handle incomparable values gracefully in strikes"""
        # Register a test task
        async def test_task(parameter: Any) -> None:
            pass  # pragma: no cover

        docket.register(test_task)

        # Create a strike with potentially incomparable values
        await docket.strike("test_task", "parameter", operator, value)

        # We should be able to add the task without errors, even if the strike would be
        # comparing incomparable values
>       execution = await docket.add(test_task)(test_value)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_striking.py:207:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
----------------------------- Captured stderr call -----------------------------
WARNING:docket.strikelist:Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL []
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'list' and 'int'
------------------------------ Captured log call -------------------------------
WARNING docket.strikelist:strikelist.py:444 Incompatible type for strike condition: 42 Operator.LESS_THAN_OR_EQUAL []
Traceback (most recent call last):
  File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/strikelist.py", line 435, in _is_match
    return value <= strike_value
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'list' and 'int'
_________________ test_restored_automatic_perpetual_does_start _________________

  + Exception Group Traceback (most recent call last):
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/test_striking.py", line 288, in test_restored_automatic_perpetual_does_start
  |     await worker.run_at_most({"my_restored_automatic_task": 3})
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 402, in run_at_most
  |     await self.run_until_finished()
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished
  |     return await self._run(forever=False)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run
  |     return await self._worker_loop(redis, forever=forever)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop
  |     async with TaskGroup() as infra:
  |                ~~~~~~~~~^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__
  |     return await self._aexit(et, exc)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit
  |     raise BaseExceptionGroup(
  |     ...<2 lines>...
  |     ) from None
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 557, in _worker_loop
    |     await self._schedule_all_automatic_perpetual_tasks()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 775, in _schedule_all_automatic_perpetual_tasks
    |     await self.docket.add(
    |         task_function, when=perpetual.initial_when, key=key
    |     )()
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py", line 372, in scheduler
    |     await execution.schedule(replace=False)
    |   File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 485, in schedule
    |     await schedule_script(
    |     ...<21 lines>...
    |     )
    |   File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__
    |     return await client.evalsha(self.sha, len(keys), *args)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command
    |     return await conn.retry.call_with_retry(
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |     ...<4 lines>...
    |     )
    |     ^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    |     return await do()
    |            ^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response
    |     return await self.parse_response(conn, command_name, **options)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response
    |     response = await connection.read_response()
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response
    |     raise response
    | redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
    | stack traceback:
    |     [string ""]:121: in main chunk
    +------------------------------------
----------------------------- Captured stderr call -----------------------------
INFO:docket.strikelist:Striking 'my_restored_automatic_task(* == *)'
INFO:docket.strikelist:Restoring 'my_restored_automatic_task(* == *)'
INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO:docket.worker:* trace(message: str, ...)
INFO:docket.worker:* fail(message: str, ...)
INFO:docket.worker:* sleep(seconds: float, ...)
INFO:docket.worker:* my_restored_automatic_task(...)
INFO:docket.strikelist:Striking 'my_restored_automatic_task(* == *)'
INFO:docket.strikelist:Restoring 'my_restored_automatic_task(* == *)'
------------------------------ Captured log call -------------------------------
INFO docket.strikelist:strikelist.py:607 Striking 'my_restored_automatic_task(* == *)'
INFO docket.strikelist:strikelist.py:607 Restoring 'my_restored_automatic_task(* == *)'
INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks:
INFO docket.worker:worker.py:1007 * trace(message: str, ...)
INFO docket.worker:worker.py:1007 * fail(message: str, ...)
INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...)
INFO docket.worker:worker.py:1007 * my_restored_automatic_task(...)
INFO docket.strikelist:strikelist.py:607 Striking 'my_restored_automatic_task(* == *)'
INFO docket.strikelist:strikelist.py:607 Restoring 'my_restored_automatic_task(* == *)'
____________ test_assert_task_scheduled_finds_task_by_function_only ____________

docket = 
simple_task = 

    async def test_assert_task_scheduled_finds_task_by_function_only(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should find a task by function alone."""
>       await docket.add(simple_task)("arg1", kwarg1="value1")

tests/test_testing.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________ test_assert_task_scheduled_finds_task_by_function_and_args __________

docket = 
simple_task = 

    async def test_assert_task_scheduled_finds_task_by_function_and_args(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should find a task by function and args."""
>       await docket.add(simple_task)("arg1", "arg2", kwarg1="value1")

tests/test_testing.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________ test_assert_task_scheduled_finds_task_by_function_and_kwargs _________

docket = 
simple_task = 

    async def test_assert_task_scheduled_finds_task_by_function_and_kwargs(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should find a task by function and kwargs."""
>       await docket.add(simple_task)("arg1", kwarg1="value1", kwarg2="value2")

tests/test_testing.py:64:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E               [string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______ test_assert_task_scheduled_finds_task_by_function_args_and_kwargs _______

docket = 
simple_task = 

    async def test_assert_task_scheduled_finds_task_by_function_args_and_kwargs(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should find a task by function, args, and kwargs."""
>       await docket.add(simple_task)("arg1", "arg2", kwarg1="value1")

tests/test_testing.py:75:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_assert_task_scheduled_finds_task_by_key _________________ docket = simple_task = async def test_assert_task_scheduled_finds_task_by_key( docket: Docket, simple_task: AsyncMock ): """assert_task_scheduled should find a task by key.""" > await docket.add(simple_task, key="my-task-key")("arg1") tests/test_testing.py:86: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: 
raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_assert_task_scheduled_works_with_function_name ______________ docket = simple_task = async def test_assert_task_scheduled_works_with_function_name( docket: Docket, simple_task: AsyncMock ): """assert_task_scheduled should work with function name as string.""" docket.register(simple_task) > await docket.add("simple_task")("arg1") tests/test_testing.py:96: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______ test_assert_task_scheduled_succeeds_with_multiple_matching_tasks _______ docket = simple_task = async def test_assert_task_scheduled_succeeds_with_multiple_matching_tasks( docket: Docket, simple_task: AsyncMock ): """assert_task_scheduled should succeed if at least one task matches.""" > await 
docket.add(simple_task)("arg1", kwarg1="value1") tests/test_testing.py:105: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read 
= await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_assert_task_scheduled_fails_when_task_not_found _____________ docket = simple_task = another_task = async def test_assert_task_scheduled_fails_when_task_not_found( docket: Docket, simple_task: AsyncMock, another_task: AsyncMock ): """assert_task_scheduled should fail with clear error when task not found.""" > await docket.add(another_task)("arg1") tests/test_testing.py:116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_assert_task_scheduled_fails_when_args_dont_match _____________ docket = simple_task = async def test_assert_task_scheduled_fails_when_args_dont_match( docket: Docket, simple_task: AsyncMock ): """assert_task_scheduled should fail when args don't match.""" > await docket.add(simple_task)("arg1", "arg2") tests/test_testing.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_assert_task_scheduled_fails_when_kwargs_dont_match ____________ docket = simple_task = async def test_assert_task_scheduled_fails_when_kwargs_dont_match( docket: Docket, simple_task: AsyncMock ): """assert_task_scheduled should fail when kwargs don't match.""" > await docket.add(simple_task)(kwarg1="value1") tests/test_testing.py:136: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not 
self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_assert_task_scheduled_finds_scheduled_future_task ____________ docket = simple_task = now = functools.partial(, datetime.timezone.utc) async def test_assert_task_scheduled_finds_scheduled_future_task( docket: Docket, simple_task: AsyncMock, now: Callable[[], datetime] ): """assert_task_scheduled should find tasks scheduled in the future.""" later = now() + timedelta(seconds=10) > await docket.add(simple_task, when=later)("arg1") tests/test_testing.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________ test_assert_task_not_scheduled_succeeds_when_different_task __________ docket = simple_task = another_task = async def test_assert_task_not_scheduled_succeeds_when_different_task( docket: 
Docket, simple_task: AsyncMock, another_task: AsyncMock ): """assert_task_not_scheduled should succeed when different task is scheduled.""" > await docket.add(another_task)("arg1") tests/test_testing.py:163: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() 
raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_assert_task_not_scheduled_fails_when_task_exists _____________ docket = simple_task = async def test_assert_task_not_scheduled_fails_when_task_exists( docket: Docket, simple_task: AsyncMock ): """assert_task_not_scheduled should fail when task is scheduled.""" > await docket.add(simple_task)("arg1") tests/test_testing.py:172: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await 
connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________ test_assert_task_not_scheduled_with_specific_args _______________ docket = simple_task = async def test_assert_task_not_scheduled_with_specific_args( docket: Docket, simple_task: AsyncMock ): """assert_task_not_scheduled should check specific args.""" > await docket.add(simple_task)("arg1") tests/test_testing.py:182: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await 
schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack 
traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_assert_task_count_all_tasks _______________________ docket = simple_task = another_task = async def test_assert_task_count_all_tasks( docket: Docket, simple_task: AsyncMock, another_task: AsyncMock ): """assert_task_count should count all tasks when no function specified.""" > await docket.add(simple_task)("arg1") tests/test_testing.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: 
Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_assert_task_count_for_specific_function _________________ docket = simple_task = another_task = async def test_assert_task_count_for_specific_function( docket: Docket, simple_task: AsyncMock, another_task: AsyncMock ): """assert_task_count should count tasks for a specific function.""" > await docket.add(simple_task)("arg1") tests/test_testing.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E       stack traceback:
E       	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________ test_assert_task_count_fails_with_wrong_count _________________

docket = simple_task = 

    async def test_assert_task_count_fails_with_wrong_count(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_count should fail with clear error when count is wrong."""
>       await docket.add(simple_task)("arg1")

tests/test_testing.py:225:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_assert_task_count_with_function_name ___________________

docket = simple_task = 

    async def test_assert_task_count_with_function_name(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_count should work with function name as string."""
        docket.register(simple_task)
>       await docket.add("simple_task")("arg1")

tests/test_testing.py:237:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
________________ test_assert_no_tasks_fails_when_tasks_present _________________

docket = simple_task = 

    async def test_assert_no_tasks_fails_when_tasks_present(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_no_tasks should fail when tasks are present."""
>       await docket.add(simple_task)("arg1")

tests/test_testing.py:252:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_assert_no_tasks_after_tasks_complete ___________________

docket = worker = simple_task = 

    async def test_assert_no_tasks_after_tasks_complete(
        docket: Docket, worker: Worker, simple_task: AsyncMock
    ):
        """assert_no_tasks should succeed after tasks complete."""
>       await docket.add(simple_task)("arg1")

tests/test_testing.py:262:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_______________ test_assert_task_scheduled_partial_kwargs_match ________________

docket = simple_task = 

    async def test_assert_task_scheduled_partial_kwargs_match(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should match subset of kwargs."""
>       await docket.add(simple_task)(kwarg1="value1", kwarg2="value2", kwarg3="value3")

tests/test_testing.py:279:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________ test_assert_task_count_includes_future_and_immediate_tasks __________

docket = simple_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_assert_task_count_includes_future_and_immediate_tasks(
        docket: Docket, simple_task: AsyncMock, now: Callable[[], datetime]
    ):
        """assert_task_count should count both immediate and future tasks."""
>       await docket.add(simple_task)("immediate")

tests/test_testing.py:292:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_assert_task_scheduled_fails_when_key_doesnt_match ____________

docket = simple_task = 

    async def test_assert_task_scheduled_fails_when_key_doesnt_match(
        docket: Docket, simple_task: AsyncMock
    ):
        """assert_task_scheduled should fail when key doesn't match."""
>       await docket.add(simple_task, key="task-1")("arg1")

tests/test_testing.py:304:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_redis_key_cleanup_successful_task ____________________

docket = worker = key_leak_checker = 

    async def test_redis_key_cleanup_successful_task(
        docket: Docket, worker: Worker, key_leak_checker: KeyCountChecker
    ) -> None:
        """Test that Redis keys are properly cleaned up after successful task execution.

        After execution, a tombstone (runs hash) with COMPLETED state remains with TTL.
        The autouse key_leak_checker fixture verifies no leaks automatically.
        """
        # Create and register a simple task
        task_executed = False

        async def successful_task():
            nonlocal task_executed
            task_executed = True
            await asyncio.sleep(0.01)  # Small delay to ensure proper execution flow

        docket.register(successful_task)

        # Schedule and execute the task
>       await docket.add(successful_task)()

tests/worker/test_bootstrap.py:33:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________________ test_redis_key_cleanup_failed_task ______________________

docket = worker = key_leak_checker = 

    async def test_redis_key_cleanup_failed_task(
        docket: Docket, worker: Worker, key_leak_checker: KeyCountChecker
    ) -> None:
        """Test that Redis keys are properly cleaned up after failed task execution.

        After failure, a tombstone (runs hash) with FAILED state remains with TTL.
        The autouse key_leak_checker fixture verifies no leaks automatically.
        """
        # Create a task that will fail
        task_attempted = False

        async def failing_task():
            nonlocal task_attempted
            task_attempted = True
            raise ValueError("Intentional test failure")

        docket.register(failing_task)

        # Schedule and execute the task (should fail)
>       await docket.add(failing_task)()

tests/worker/test_bootstrap.py:61:
E       redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________________ test_redis_key_cleanup_cancelled_task _____________________

docket = worker = key_leak_checker = 

    async def test_redis_key_cleanup_cancelled_task(
        docket: Docket, worker: Worker, key_leak_checker: KeyCountChecker
    ) -> None:
        """Test that Redis keys are properly cleaned up after task cancellation.

        After cancellation, a tombstone (runs hash) with CANCELLED state remains
        with TTL to support the claim check pattern via get_execution().
        All other keys (queue, parked data, etc.) are cleaned up.

        The autouse key_leak_checker fixture verifies no leaks automatically.
""" from docket.execution import ExecutionState # Create a task that won't be executed task_executed = False async def task_to_cancel(): nonlocal task_executed task_executed = True # pragma: no cover docket.register(task_to_cancel) # Schedule the task for future execution future_time = datetime.now(timezone.utc) + timedelta(seconds=10) > execution = await docket.add(task_to_cancel, future_time)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_bootstrap.py:93: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> 
Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________ test_consumer_group_created_on_first_worker_read _______________ redis_url = 'memory://' make_docket_name = ._make_name at 0x7fff8ef53ab0> async def test_consumer_group_created_on_first_worker_read( redis_url: str, make_docket_name: Callable[[], str] ): """Consumer group should be created when worker first tries to read. Issue #206: Lazy stream/consumer group bootstrap. 
""" docket = Docket(name=make_docket_name(), url=redis_url) async def dummy_task(): pass async with docket: docket.register(dummy_task) > await docket.add(dummy_task)() tests/worker/test_bootstrap.py:149: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() 
raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_multiple_workers_racing_to_create_group _________________ redis_url = 'memory://' make_docket_name = ._make_name at 0x7fff8ef51590> async def test_multiple_workers_racing_to_create_group( redis_url: str, make_docket_name: Callable[[], str] ): """Multiple workers starting simultaneously should all succeed. Issue #206: Lazy stream/consumer group bootstrap. """ docket = Docket(name=make_docket_name(), url=redis_url) call_counts: dict[str, int] = {} async def counting_task(worker: Worker = CurrentWorker()): call_counts[worker.name] = call_counts.get(worker.name, 0) + 1 async with docket: docket.register(counting_task) for _ in range(20): > await docket.add(counting_task)() tests/worker/test_bootstrap.py:186: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_worker_handles_nogroup_error_gracefully _________________ redis_url = 'memory://' make_docket_name = ._make_name at 0x7fff8ef50930> async def 
test_worker_handles_nogroup_error_gracefully( redis_url: str, make_docket_name: Callable[[], str] ): """Worker should handle NOGROUP error and create group automatically. Issue #206: Lazy stream/consumer group bootstrap. """ docket = Docket(name=make_docket_name(), url=redis_url) task_executed = False async def simple_task(): nonlocal task_executed task_executed = True async with docket: docket.register(simple_task) > await docket.add(simple_task)() tests/worker/test_bootstrap.py:230: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: 
Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError __________________ test_worker_handles_nogroup_in_xreadgroup ___________________ redis_url = 'memory://' make_docket_name = ._make_name at 0x7fff8ef50ca0> caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f25ff50> async def test_worker_handles_nogroup_in_xreadgroup( redis_url: str, make_docket_name: Callable[[], str], caplog: pytest.LogCaptureFixture, ): """Worker should handle NOGROUP error in xreadgroup and retry. Issue #206: Lazy stream/consumer group bootstrap. This tests the rare case where xautoclaim succeeds but then xreadgroup gets NOGROUP (e.g., if the group was deleted between the two calls). 
""" from unittest.mock import patch import redis.asyncio from redis.exceptions import ResponseError docket = Docket(name=make_docket_name(), url=redis_url) task_executed = False async def simple_task(): nonlocal task_executed task_executed = True async with docket: docket.register(simple_task) # Add a task so the worker has something to process > await docket.add(simple_task)() tests/worker/test_bootstrap.py:274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_worker_acknowledges_messages _______________________ docket = worker = the_task = async def test_worker_acknowledges_messages( docket: Docket, worker: Worker, the_task: AsyncMock ): """The worker should acknowledge and drain messages as they're processed""" > await docket.add(the_task)() tests/worker/test_core.py:28: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in 
_send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________________ test_two_workers_split_work __________________________ docket = async def test_two_workers_split_work(docket: Docket): """Two workers should split the workload""" # Use concurrency=1 so workers claim tasks one at a time for finer distribution worker1 = Worker(docket, concurrency=1) worker2 = Worker(docket, concurrency=1) call_counts = { worker1: 0, 
worker2: 0, } # Tasks wait for this event, ensuring both workers claim work before any completes proceed = asyncio.Event() async def the_task(worker: Worker = CurrentWorker()): await proceed.wait() call_counts[worker] += 1 for _ in range(100): > await docket.add(the_task)() tests/worker/test_core.py:62: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = 
self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_worker_reconnects_when_connection_is_lost ________________ docket = the_task = async def test_worker_reconnects_when_connection_is_lost( docket: Docket, the_task: AsyncMock ): """The worker should reconnect when the connection is lost""" worker = Worker(docket, reconnection_delay=timedelta(milliseconds=100)) # Mock the _worker_loop method to fail once then succeed original_worker_loop = worker._worker_loop # type: ignore[protected-access] call_count = 0 async def mock_worker_loop(redis: Redis, forever: bool = False): nonlocal call_count call_count += 1 if call_count == 1: raise ConnectionError("Simulated connection error") return await original_worker_loop(redis, forever=forever) worker._worker_loop = mock_worker_loop # type: ignore[protected-access] > await docket.add(the_task)() tests/worker/test_core.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await 
client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: 
ResponseError ____________________ test_worker_respects_concurrency_limit ____________________ docket = worker = async def test_worker_respects_concurrency_limit(docket: Docket, worker: Worker): """Worker should not exceed its configured concurrency limit""" task_results: set[int] = set() currently_running = 0 max_concurrency_observed = 0 async def concurrency_tracking_task(index: int): nonlocal currently_running, max_concurrency_observed currently_running += 1 max_concurrency_observed = max(max_concurrency_observed, currently_running) await asyncio.sleep(0.1) # Long enough to overlap even on slow CI runners task_results.add(index) currently_running -= 1 for i in range(50): > await docket.add(concurrency_tracking_task)(index=i) tests/worker/test_core.py:128: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = 
ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____ test_worker_handles_unregistered_task_execution_on_initial_delivery ______ docket = worker = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8f2cbcb0> the_task = async def test_worker_handles_unregistered_task_execution_on_initial_delivery( docket: Docket, worker: Worker, caplog: pytest.LogCaptureFixture, the_task: AsyncMock, ): """worker should handle the case when an unregistered task is executed""" > await docket.add(the_task)() tests/worker/test_core.py:145: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( 
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string 
""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________ test_worker_handles_unregistered_task_execution_on_redelivery _________ docket = caplog = <_pytest.logging.LogCaptureFixture object at 0x7fff8ec41e80> async def test_worker_handles_unregistered_task_execution_on_redelivery( docket: Docket, caplog: pytest.LogCaptureFixture, ): """worker should handle the case when an unregistered task is redelivered""" async def test_task(): await asyncio.sleep(0.01) # Register and schedule the task first docket.register(test_task) > await docket.add(test_task)() tests/worker/test_core.py:168: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil 
value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_worker_concurrency_cleanup_without_dependencies _____________ docket = async def test_worker_concurrency_cleanup_without_dependencies(docket: Docket): """Test worker cleanup when dependencies are not defined.""" cleanup_executed = False async def simple_task(): nonlocal cleanup_executed # Force an exception after dependencies would be set raise ValueError("Force cleanup path") > await docket.add(simple_task)() tests/worker/test_core.py:393: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_worker_concurrency_no_limit_with_custom_docket ______________ 
docket = 

    async def test_worker_concurrency_no_limit_with_custom_docket(docket: Docket):
        """Test early return when task has no concurrency limit using custom docket."""
        task_executed = False

        async def task_without_concurrency():
            nonlocal task_executed
            task_executed = True

>       await docket.add(task_without_concurrency)()

tests/worker/test_core.py:412: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________________ test_worker_exception_before_dependencies ___________________

docket = 

    async def test_worker_exception_before_dependencies(docket: Docket):
        """Test finally block when exception occurs before dependencies are set."""
        task_failed = False

        async def task_that_will_fail():
            nonlocal task_failed
            task_failed = True
            raise RuntimeError("Test exception for coverage")

        try:
            await task_that_will_fail()
        except RuntimeError:
            pass

        # Reset flag to test worker behavior
        task_failed = False

        # Mock resolved_dependencies to fail before setting dependencies
>       await docket.add(task_that_will_fail)()

tests/worker/test_core.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_invariant_tasks_by_key_empty_after_completion ______________

docket = 

    async def test_invariant_tasks_by_key_empty_after_completion(docket: Docket):
        """After run_until_finished,
        _tasks_by_key should be empty (all tasks done)."""

        async def simple_task():
            pass

        docket.register(simple_task)

        async with Worker(docket, concurrency=4) as worker:
            for _ in range(50):
>               await docket.add(simple_task)()

tests/worker/test_invariants.py:30: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
______________ test_invariant_tasks_by_key_no_growth_over_batches ______________

docket = 

    async def test_invariant_tasks_by_key_no_growth_over_batches(docket: Docket):
        """Running multiple batches should not accumulate entries in _tasks_by_key."""

        async def simple_task():
            pass

        docket.register(simple_task)

        async with Worker(docket, concurrency=4) as worker:
            for batch in range(5):
                for _ in range(20):
>                   await docket.add(simple_task)()

tests/worker/test_invariants.py:49: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
____________ test_invariant_execution_counts_empty_after_completion ____________

docket = 

    async def test_invariant_execution_counts_empty_after_completion(docket: Docket):
        """_execution_counts should be empty after normal run_until_finished (no run_at_most)."""

        async def simple_task():
            pass

        docket.register(simple_task)

        async with Worker(docket, concurrency=4) as worker:
            for _ in range(10):
>               await docket.add(simple_task)()

tests/worker/test_invariants.py:66: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
__________ test_invariant_execution_counts_cleared_after_run_at_most ___________

docket = 

    async def test_invariant_execution_counts_cleared_after_run_at_most(docket: Docket):
        """_execution_counts should be cleared after run_at_most completes."""
        iteration_count = 0

        async def perpetual_task(
            perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=10)),
        ):
            nonlocal iteration_count
            iteration_count += 1

        docket.register(perpetual_task)

>       await docket.add(perpetual_task, key="test-perpetual")()

tests/worker/test_invariants.py:85: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_________________ test_invariant_cleanup_after_task_exceptions _________________

docket = 

    async def test_invariant_cleanup_after_task_exceptions(docket: Docket):
        """_tasks_by_key should be cleaned up even when tasks raise exceptions."""

        async def failing_task():
            raise ValueError("intentional failure")

        docket.register(failing_task)

        async with Worker(docket, concurrency=4) as worker:
            for _ in range(10):
>               await docket.add(failing_task)()

tests/worker/test_invariants.py:133: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________________ test_invariant_cleanup_with_varied_tasks ___________________

docket = 

    async def test_invariant_cleanup_with_varied_tasks(docket: Docket):
        """Cleanup should work with all task types: deps, timeouts, returns, kwargs."""

        async def simple_task():
            pass

        async def task_with_deps(
            execution: Execution = CurrentExecution(),
            logger: logging.LoggerAdapter[logging.Logger] = TaskLogger(),
        ):
            logger.info(f"Running {execution.key}")

        async def task_with_timeout(
            timeout: Timeout = Timeout(timedelta(seconds=5)),
        ):
            await asyncio.sleep(0.01)

        async def task_with_return() -> str:
            return "result"

        async def task_with_kwargs(a: int, b: str = "default"):
            pass

        for task in [
            simple_task,
            task_with_deps,
            task_with_timeout,
            task_with_return,
            task_with_kwargs,
        ]:
            docket.register(task)

        async with Worker(docket, concurrency=4) as worker:
            # Add varied tasks
            for _ in range(5):
>               await docket.add(simple_task)()

tests/worker/test_invariants.py:176: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
_____________ test_run_forever_cancels_promptly_with_future_tasks ______________

docket = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_run_forever_cancels_promptly_with_future_tasks(
        docket: Docket, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """run_forever() should cancel promptly even with future-scheduled tasks.

        Issue #260: Perpetual tasks block worker shutdown.
        """
>       execution = await docket.add(the_task, now() + timedelta(seconds=15))()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/worker/test_lifecycle.py:28: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler
    await execution.schedule(replace=False)
../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule
    await schedule_script(
/usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__
    return await client.evalsha(self.sha, len(keys), *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command
    return await conn.retry.call_with_retry(
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry
    return await do()
           ^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = ,db=0)>
kwargs = {}
response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk')
timeout = None, can_read = True

    async def read_response(self, **kwargs: Any) -> Any:  # type: ignore
        if not self._sock:
            raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        if not self._server.connected:
            try:
                response = self._sock.responses.get_nowait()
            except asyncio.QueueEmpty:
                if kwargs.get("disconnect_on_error", True):
                    await self.disconnect()
                raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
        else:
            timeout: Optional[float] = kwargs.pop("timeout", None)
            can_read = await self.can_read(timeout)
            response = await self._reader.read(0) if can_read and self._reader else None
        if isinstance(response, RaiseErrorTypes):
>           raise response
E           redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack')
E           stack traceback:
E           	[string ""]:135: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
___________ test_run_until_finished_exits_promptly_with_future_tasks ___________

docket = 
the_task = 
now = functools.partial(, datetime.timezone.utc)

    async def test_run_until_finished_exits_promptly_with_future_tasks(
        docket: Docket, the_task: AsyncMock, now: Callable[[], datetime]
    ):
        """run_until_finished() should exit promptly when only future tasks exist.

        Issue #260: Perpetual tasks block worker shutdown.
""" > execution = await docket.add(the_task, now() + timedelta(seconds=15))() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_lifecycle.py:53: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_run_at_most_cancels_promptly_with_future_tasks ______________ docket = the_task = now = functools.partial(, datetime.timezone.utc) async def test_run_at_most_cancels_promptly_with_future_tasks( docket: Docket, the_task: AsyncMock, now: Callable[[], datetime] ): """run_at_most() should cancel promptly even with future-scheduled tasks. Issue #260: Perpetual tasks block worker shutdown. 
""" > execution = await docket.add(the_task, now() + timedelta(seconds=15))() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_lifecycle.py:74: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_worker_drains_active_tasks_on_shutdown __________________ docket = async def test_worker_drains_active_tasks_on_shutdown(docket: Docket): """Active tasks are gathered and processed in the finally block at shutdown. Uses an event handshake so the task is guaranteed to still be running when the worker is cancelled. The finally block's asyncio.gather waits for the task, and we release it from a separate coroutine. 
""" task_started = asyncio.Event() task_can_finish = asyncio.Event() task_drained = False async def blocking_task(): nonlocal task_drained task_started.set() await task_can_finish.wait() task_drained = True > await docket.add(blocking_task)() tests/worker/test_lifecycle.py:296: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except 
asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________ test_perpetual_tasks_are_scheduled_close_to_target_time ____________ docket = worker = async def test_perpetual_tasks_are_scheduled_close_to_target_time( docket: Docket, worker: Worker ): """A perpetual task is scheduled as close to the target period as possible""" timestamps: list[datetime] = [] async def perpetual_task( a: str, b: int, perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)), ): timestamps.append(datetime.now(timezone.utc)) > await docket.add(perpetual_task, key="my-key")(a="a", b=2) tests/worker/test_scheduling.py:29: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______ test_worker_can_exit_from_perpetual_tasks_that_queue_further_tasks ______ docket = worker = async def test_worker_can_exit_from_perpetual_tasks_that_queue_further_tasks( docket: Docket, worker: Worker ): """A worker can exit if it's processing a perpetual task that queues more tasks""" inner_calls = 0 
async def inner_task(): nonlocal inner_calls inner_calls += 1 async def perpetual_task( docket: Docket = CurrentDocket(), perpetual: Perpetual = Perpetual(every=timedelta(milliseconds=50)), ): await docket.add(inner_task)() await docket.add(inner_task)() > execution = await docket.add(perpetual_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_scheduling.py:66: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not 
self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ____________ test_worker_can_exit_from_long_horizon_perpetual_tasks ____________ docket = worker = async def test_worker_can_exit_from_long_horizon_perpetual_tasks( docket: Docket, worker: Worker ): """A worker can exit in a timely manner from a perpetual task that has a long horizon because it is stricken on both execution and rescheduling""" calls: int = 0 async def perpetual_task( a: str, b: int, perpetual: Perpetual = Perpetual(every=timedelta(weeks=37)), ): nonlocal calls calls += 1 > await docket.add(perpetual_task, key="my-key")(a="a", b=2) tests/worker/test_scheduling.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( 
/usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________ test_worker_timeout_exceeds_redelivery_timeout ________________ docket = async def test_worker_timeout_exceeds_redelivery_timeout(docket: Docket): """Test worker handles user timeout 
longer than redelivery timeout.""" task_executed = False async def test_task( timeout: Timeout = Timeout(timedelta(seconds=5)), ): nonlocal task_executed task_executed = True await asyncio.sleep(0.01) > await docket.add(test_task)() tests/worker/test_scheduling.py:122: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _________________ test_replacement_race_condition_stream_tasks _________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_replacement_race_condition_stream_tasks( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """Test that replace() properly cancels tasks already in the stream. This reproduces the race condition where: 1. Task is scheduled for immediate execution 2. Scheduler moves it to stream 3. replace() tries to cancel but only checks queue/hash, not stream 4. 
Both original and replacement tasks execute """ key = f"my-cool-task:{uuid4()}" # Schedule a task immediately (will be moved to stream quickly) > await docket.add(the_task, now(), key=key)("a", "b", c="c") tests/worker/test_scheduling.py:145: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if 
kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________ test_replace_task_in_queue_before_stream ___________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_replace_task_in_queue_before_stream( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """Test that replace() works correctly when task is still in queue.""" key = f"my-cool-task:{uuid4()}" # Schedule a task slightly in the future (stays in queue) soon = now() + timedelta(seconds=1) > await docket.add(the_task, soon, key=key)("a", "b", c="c") tests/worker/test_scheduling.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ 
/usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ________________________ test_rapid_replace_operations _________________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_rapid_replace_operations( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """Test multiple rapid 
replace operations.""" key = f"my-cool-task:{uuid4()}" # Schedule initial task > await docket.add(the_task, now(), key=key)("a", "b", c="c") tests/worker/test_scheduling.py:193: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___ test_duplicate_execution_race_condition_non_perpetual_task[default_ttl] ____ redis_url = 'memory://', execution_ttl = None make_docket_name = ._make_name at 0x7fff8e84ecf0> @pytest.mark.parametrize( "execution_ttl", [None, timedelta(0)], ids=["default_ttl", "zero_ttl"] ) async def test_duplicate_execution_race_condition_non_perpetual_task( redis_url: str, execution_ttl: timedelta | None, make_docket_name: Callable[[], str] ): """Reproduce race condition where non-perpetual tasks execute multiple times. Bug: known_task_key is deleted BEFORE task function runs (worker.py:588), allowing duplicate docket.add() calls with the same key to succeed while the original task is still executing. Timeline: 1. Task A scheduled with key="task:123" -> known_key set 2. Worker picks up Task A, _perpetuate_if_requested() returns False 3. Worker calls _delete_known_task() -> known_key DELETED 4. Worker starts executing the actual task function (slow task) 5. Meanwhile, docket.add(key="task:123") checks EXISTS known_key -> 0 6. Duplicate task scheduled and picked up by concurrent worker 7. Both tasks execute in parallel Tests both default TTL and execution_ttl=0 to ensure fix doesn't depend on volatile results keys. 
""" execution_count = 0 task_started = asyncio.Event() async def slow_task(task_id: str): nonlocal execution_count execution_count += 1 task_started.set() await asyncio.sleep(0.3) docket_kwargs: dict[str, object] = { "name": make_docket_name(), "url": redis_url, } if execution_ttl is not None: docket_kwargs["execution_ttl"] = execution_ttl async with Docket(**docket_kwargs) as docket: # type: ignore[arg-type] docket.register(slow_task) task_key = f"race-test:{uuid4()}" async with Worker(docket, concurrency=2) as worker: worker_task = asyncio.create_task(worker.run_until_finished()) # Schedule first task > await docket.add(slow_task, key=task_key)("first") tests/worker/test_scheduling.py:255: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) INFO:docket.worker:* slow_task(task_id: str) ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:1007 * slow_task(task_id: str) _____ test_duplicate_execution_race_condition_non_perpetual_task[zero_ttl] _____ redis_url = 'memory://', execution_ttl = datetime.timedelta(0) make_docket_name = ._make_name at 0x7fff8e84eda0> @pytest.mark.parametrize( "execution_ttl", [None, timedelta(0)], ids=["default_ttl", "zero_ttl"] ) async def test_duplicate_execution_race_condition_non_perpetual_task( redis_url: str, execution_ttl: timedelta | None, make_docket_name: Callable[[], str] ): """Reproduce race condition where non-perpetual tasks execute multiple times. Bug: known_task_key is deleted BEFORE task function runs (worker.py:588), allowing duplicate docket.add() calls with the same key to succeed while the original task is still executing. Timeline: 1. Task A scheduled with key="task:123" -> known_key set 2. Worker picks up Task A, _perpetuate_if_requested() returns False 3. Worker calls _delete_known_task() -> known_key DELETED 4. Worker starts executing the actual task function (slow task) 5. Meanwhile, docket.add(key="task:123") checks EXISTS known_key -> 0 6. Duplicate task scheduled and picked up by concurrent worker 7. Both tasks execute in parallel Tests both default TTL and execution_ttl=0 to ensure fix doesn't depend on volatile results keys. 
""" execution_count = 0 task_started = asyncio.Event() async def slow_task(task_id: str): nonlocal execution_count execution_count += 1 task_started.set() await asyncio.sleep(0.3) docket_kwargs: dict[str, object] = { "name": make_docket_name(), "url": redis_url, } if execution_ttl is not None: docket_kwargs["execution_ttl"] = execution_ttl async with Docket(**docket_kwargs) as docket: # type: ignore[arg-type] docket.register(slow_task) task_key = f"race-test:{uuid4()}" async with Worker(docket, concurrency=2) as worker: worker_task = asyncio.create_task(worker.run_until_finished()) # Schedule first task > await docket.add(slow_task, key=task_key)("first") tests/worker/test_scheduling.py:255: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to 
f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) INFO:docket.worker:* slow_task(task_id: str) ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:1007 * slow_task(task_id: str) _______________ test_wrongtype_error_with_legacy_known_task_key ________________ + Exception Group Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/pydocket-0.17.9/tests/worker/test_scheduling.py", line 322, in test_wrongtype_error_with_legacy_known_task_key | await worker.run_until_finished() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 368, in run_until_finished | return await self._run(forever=False) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 413, in _run | return await self._worker_loop(redis, forever=forever) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 548, in _worker_loop | async with TaskGroup() as infra: | ~~~~~~~~~^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 72, in __aexit__ | return await self._aexit(et, exc) | ^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib64/python3.14/asyncio/taskgroups.py", line 174, in _aexit | raise BaseExceptionGroup( | ...<2 lines>... 
| ) from None | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception) +-+---------------- 1 ---------------- | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute | await execution.mark_as_completed(result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed | await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | ...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... 
| ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk | | During handling of the above exception, another exception occurred: | | Traceback (most recent call last): | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 571, in _worker_loop | await process_completed_tasks() | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 520, in process_completed_tasks | await task | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 998, in _execute | await execution.mark_as_failed(error_msg, result_key=result_key) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 748, in mark_as_failed | await self._mark_as_terminal( | ExecutionState.FAILED, error=error, result_key=result_key | ) | File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal | await terminal_script( | 
...<8 lines>... | ) | File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ | return await client.evalsha(self.sha, len(keys), *args) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command | return await conn.retry.call_with_retry( | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ...<4 lines>... | ) | ^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry | return await do() | ^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response | return await self.parse_response(conn, command_name, **options) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response | response = await connection.read_response() | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response | raise response | redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') | stack traceback: | [string ""]:22: in main chunk +------------------------------------ ----------------------------- Captured stderr call ----------------------------- INFO:docket.worker:Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO:docket.worker:* trace(message: str, ...) INFO:docket.worker:* fail(message: str, ...) INFO:docket.worker:* sleep(seconds: float, ...) 
INFO:docket.worker:↪ [ 60ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} INFO:docket.task.trace:legacy task test: 'legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc' added to docket 'test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4' 0:00:00.060194 ago now running on worker 'bde710d1552f45a3b346e4ebac757aed#574' ERROR:docket.worker:↩ [ 1ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk ------------------------------ Captured log call ------------------------------- INFO docket.worker:worker.py:1005 Starting worker 'bde710d1552f45a3b346e4ebac757aed#574' with the following tasks: INFO docket.worker:worker.py:1007 * trace(message: str, ...) INFO docket.worker:worker.py:1007 * fail(message: str, ...) INFO docket.worker:worker.py:1007 * sleep(seconds: float, ...) 
INFO docket.worker:worker.py:828 ↪ [ 60ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} INFO docket.task.trace:tasks.py:24 legacy task test: 'legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc' added to docket 'test-docket-c62672df-c764-4d0c-b6ab-6bd0787f9ae4' 0:00:00.060194 ago now running on worker 'bde710d1552f45a3b346e4ebac757aed#574' ERROR docket.worker:worker.py:975 ↩ [ 1ms] trace(...){legacy-task:cc272bb1-d1f3-43db-8ef6-0e23eeffe6bc} Traceback (most recent call last): File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/worker.py", line 920, in _execute await execution.mark_as_completed(result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 737, in mark_as_completed await self._mark_as_terminal(ExecutionState.COMPLETED, result_key=result_key) File "/builddir/build/BUILD/python-pydocket-0.17.9-build/BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py", line 706, in _mark_as_terminal await terminal_script( ...<8 lines>... ) File "/usr/lib/python3.14/site-packages/redis/commands/core.py", line 5572, in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 616, in execute_command return await conn.retry.call_with_retry( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ...<4 lines>... 
) ^ File "/usr/lib/python3.14/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry return await do() ^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 590, in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/redis/asyncio/client.py", line 637, in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.14/site-packages/fakeredis/aioredis.py", line 185, in read_response raise response redis.exceptions.ResponseError: Error running script (call to f_8b51cd099d9bcd3ff5c790481013ff8b9acfc6d2): @user_script:?: [string ""]:22: attempt to call a nil value (global 'unpack') stack traceback: [string ""]:22: in main chunk ___________________ test_replace_task_with_legacy_known_key ____________________ docket = worker = the_task = now = functools.partial(, datetime.timezone.utc) async def test_replace_task_with_legacy_known_key( docket: Docket, worker: Worker, the_task: AsyncMock, now: Callable[[], datetime] ): """Test that replace() works with legacy string known_keys. This reproduces the exact production scenario where replace() would get WRONGTYPE errors when trying to HGET on legacy string known_keys. The main goal is to verify no WRONGTYPE error occurs. 
""" key = f"legacy-replace-task:{uuid4()}" # Simulate legacy state: create known_key as string (old format) async with docket.redis() as redis: known_task_key = docket.known_task_key(key) when = now() # Create legacy known_key as STRING (what old code did) await redis.set(known_task_key, str(when.timestamp())) # Now try to replace - this should work without WRONGTYPE error # The key point is that this call succeeds without throwing WRONGTYPE replacement_time = now() + timedelta(seconds=1) > await docket.replace("trace", replacement_time, key=key)("replacement message") tests/worker/test_scheduling.py:363: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:469: in scheduler await execution.schedule(replace=True) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 
\'unpack\')\nstack traceback:\n\t[string ""]:135: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:135: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:135: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _______________________ test_task_executes_with_ttl_zero _______________________ zero_ttl_docket = async def test_task_executes_with_ttl_zero(zero_ttl_docket: Docket) -> None: """Tasks should execute successfully when execution_ttl is set to 0.""" executed: list[int] = [] async def simple_task(value: int) -> int: executed.append(value) return value * 2 zero_ttl_docket.register(simple_task) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(simple_task)(42) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_ttl_zero.py:21: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ 
return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk 
/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError _____________ test_state_record_expires_immediately_with_ttl_zero ______________ zero_ttl_docket = async def test_state_record_expires_immediately_with_ttl_zero( zero_ttl_docket: Docket, ) -> None: """State records should be deleted immediately when execution_ttl is 0.""" async def simple_task() -> str: return "done" zero_ttl_docket.register(simple_task) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(simple_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_ttl_zero.py:39: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 
\'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ______________________ test_result_storage_with_ttl_zero _______________________ zero_ttl_docket = async def test_result_storage_with_ttl_zero(zero_ttl_docket: Docket) -> None: """Results should be stored with TTL of 0 when execution_ttl is 0.""" async def task_with_result() -> str: return "result" zero_ttl_docket.register(task_with_result) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(task_with_result)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_ttl_zero.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, 
len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError 
________________________ test_failed_task_with_ttl_zero ________________________ zero_ttl_docket = async def test_failed_task_with_ttl_zero(zero_ttl_docket: Docket) -> None: """Failed tasks should handle TTL=0 correctly.""" async def failing_task() -> None: raise ValueError("intentional failure") zero_ttl_docket.register(failing_task) async with Worker(docket=zero_ttl_docket) as worker: > execution = await zero_ttl_docket.add(failing_task)() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_ttl_zero.py:80: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def 
read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) else: timeout: Optional[float] = kwargs.pop("timeout", None) can_read = await self.can_read(timeout) response = await self._reader.read(0) if can_read and self._reader else None if isinstance(response, RaiseErrorTypes): > raise response E redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack') E stack traceback: E [string ""]:121: in main chunk /usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError ___________________________ test_mixed_ttl_workload ____________________________ redis_url = 'memory://' make_docket_name = ._make_name at 0x7fff8f16b5e0> async def test_mixed_ttl_workload( redis_url: str, make_docket_name: Callable[[], str] ) -> None: """Tasks with different TTL settings should not interfere with each other.""" async with ( # pragma: no branch Docket( name=make_docket_name(), url=redis_url, execution_ttl=timedelta(seconds=60), ) as docket_with_ttl, Docket( name=make_docket_name(), url=redis_url, execution_ttl=timedelta(0), ) as docket_with_zero_ttl, ): results_with_ttl: list[int] = [] results_zero_ttl: list[int] = [] async def task_with_ttl(value: int) -> int: results_with_ttl.append(value) return value * 2 async def task_zero_ttl(value: int) -> int: results_zero_ttl.append(value) return value * 3 docket_with_ttl.register(task_with_ttl) docket_with_zero_ttl.register(task_zero_ttl) async with ( # pragma: no branch Worker(docket=docket_with_ttl) as worker_with_ttl, Worker(docket=docket_with_zero_ttl) as worker_with_zero_ttl, ): # Schedule 
tasks on both dockets > exec_with_ttl = await docket_with_ttl.add(task_with_ttl)(10) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/worker/test_ttl_zero.py:127: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../BUILDROOT/usr/lib/python3.14/site-packages/docket/docket.py:372: in scheduler await execution.schedule(replace=False) ../BUILDROOT/usr/lib/python3.14/site-packages/docket/execution.py:485: in schedule await schedule_script( /usr/lib/python3.14/site-packages/redis/commands/core.py:5572: in __call__ return await client.evalsha(self.sha, len(keys), *args) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:616: in execute_command return await conn.retry.call_with_retry( /usr/lib/python3.14/site-packages/redis/asyncio/retry.py:59: in call_with_retry return await do() ^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:590: in _send_command_parse_response return await self.parse_response(conn, command_name, **options) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/site-packages/redis/asyncio/client.py:637: in parse_response response = await connection.read_response() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ,db=0)> kwargs = {} response = ResponseError('Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global \'unpack\')\nstack traceback:\n\t[string ""]:121: in main chunk') timeout = None, can_read = True async def read_response(self, **kwargs: Any) -> Any: # type: ignore if not self._sock: raise redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG) if not self._server.connected: try: response = self._sock.responses.get_nowait() except asyncio.QueueEmpty: if kwargs.get("disconnect_on_error", True): await self.disconnect() raise 
redis_async.ConnectionError(msgs.CONNECTION_ERROR_MSG)
            else:
                timeout: Optional[float] = kwargs.pop("timeout", None)
                can_read = await self.can_read(timeout)
                response = await self._reader.read(0) if can_read and self._reader else None
            if isinstance(response, RaiseErrorTypes):
>               raise response
E               redis.exceptions.ResponseError: Error running script (call to f_cd7d9c77b5fec5a727af7a246da08acd8b105e5a): @user_script:?: [string ""]:121: attempt to call a nil value (global 'unpack')
E               stack traceback:
E               	[string ""]:121: in main chunk

/usr/lib/python3.14/site-packages/fakeredis/aioredis.py:185: ResponseError
=========================== short test summary info ============================
FAILED tests/cli/test_worker.py::test_rich_logging_format - redis.exceptions....
FAILED tests/cli/test_worker.py::test_plain_logging_format - redis.exceptions...
FAILED tests/concurrency_limits/test_basic.py::test_basic_concurrency_limit
FAILED tests/concurrency_limits/test_basic.py::test_per_task_concurrency_limit
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_single_argument
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_different_arguments
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_max_concurrent
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_missing_argument_error
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_with_custom_scope
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_limit_without_concurrency_dependency
FAILED tests/concurrency_limits/test_basic.py::test_concurrency_keys_are_handled
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_with_task_failures
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_error_handling_during_execution
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_multiple_workers_coordination
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_refresh_handles_redis_errors
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_robustness_under_stress
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_edge_cases
FAILED tests/concurrency_limits/test_errors_and_resilience.py::test_worker_graceful_shutdown_with_concurrency_management
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_limits_task_queuing_behavior
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_different_customer_branches
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_limits_different_scopes
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_refresh_mechanism_integration
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_with_quick_tasks
FAILED tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_with_dependencies_integration
FAILED tests/concurrency_limits/test_execution_patterns.py::test_concurrency_limited_task_successfully_acquires_slot
FAILED tests/concurrency_limits/test_redelivery.py::test_task_timeout_with_explicit_timeout
FAILED tests/concurrency_limits/test_redelivery.py::test_task_timeout_with_concurrent_tasks
FAILED tests/concurrency_limits/test_redelivery.py::test_explicit_timeout_limits_long_tasks
FAILED tests/concurrency_limits/test_redelivery.py::test_short_tasks_complete_within_timeout
FAILED tests/concurrency_limits/test_redelivery.py::test_redeliveries_respect_concurrency_limits
FAILED tests/concurrency_limits/test_redelivery.py::test_concurrency_blocked_task_executes_exactly_once
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_missing_argument_fails_task
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_no_limit_early_return
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_missing_argument_shows_available_args
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_on_success
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_on_failure
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_after_task_completion
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_worker_handles_concurrent_task_cleanup_gracefully
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_finally_block_releases_concurrency_on_success
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_stale_concurrency_slots_are_scavenged_when_full
FAILED tests/concurrency_limits/test_worker_mechanics.py::test_graceful_shutdown_releases_concurrency_slots
FAILED tests/fundamentals/test_async_dependencies.py::test_simple_function_dependencies
FAILED tests/fundamentals/test_async_dependencies.py::test_contextual_dependencies
FAILED tests/fundamentals/test_async_dependencies.py::test_dependencies_of_dependencies
FAILED tests/fundamentals/test_async_dependencies.py::test_dependencies_can_ask_for_docket_dependencies
FAILED tests/fundamentals/test_async_dependencies.py::test_dependency_failures_are_task_failures
FAILED tests/fundamentals/test_async_dependencies.py::test_contextual_dependency_before_failures_are_task_failures
FAILED tests/fundamentals/test_async_dependencies.py::test_contextual_dependency_after_failures_are_task_failures
FAILED tests/fundamentals/test_async_dependencies.py::test_dependencies_can_ask_for_task_arguments
FAILED tests/fundamentals/test_async_dependencies.py::test_task_arguments_may_be_optional
FAILED tests/fundamentals/test_builtin_tasks.py::test_all_dockets_have_a_trace_task
FAILED tests/fundamentals/test_builtin_tasks.py::test_all_dockets_have_a_fail_task
FAILED tests/fundamentals/test_cancellation.py::test_cancelling_future_task
FAILED tests/fundamentals/test_cancellation.py::test_cancelling_immediate_task
FAILED tests/fundamentals/test_cancellation.py::test_cancellation_is_idempotent
FAILED tests/fundamentals/test_context_injection.py::test_supports_requesting_current_docket
FAILED tests/fundamentals/test_context_injection.py::test_supports_requesting_current_worker
FAILED tests/fundamentals/test_context_injection.py::test_supports_requesting_current_execution
FAILED tests/fundamentals/test_context_injection.py::test_supports_requesting_current_task_key
FAILED tests/fundamentals/test_cron.py::test_cron_task_reschedules_itself - r...
FAILED tests/fundamentals/test_cron.py::test_cron_tasks_are_automatically_scheduled
FAILED tests/fundamentals/test_cron.py::test_cron_tasks_continue_after_errors
FAILED tests/fundamentals/test_cron.py::test_cron_tasks_can_cancel_themselves
FAILED tests/fundamentals/test_cron.py::test_cron_supports_vixie_keywords - r...
FAILED tests/fundamentals/test_cron.py::test_automatic_cron_waits_for_scheduled_time
FAILED tests/fundamentals/test_cron.py::test_cron_with_timezone - redis.excep...
FAILED tests/fundamentals/test_errors.py::test_adding_task_with_unbindable_arguments
FAILED tests/fundamentals/test_idempotency.py::test_adding_is_idempotent - re...
FAILED tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_in_the_future
FAILED tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_between_the_future_and_present
FAILED tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_in_the_present
FAILED tests/fundamentals/test_logging.py::test_tasks_can_opt_into_argument_logging
FAILED tests/fundamentals/test_logging.py::test_tasks_can_opt_into_logging_collection_lengths
FAILED tests/fundamentals/test_logging.py::test_logging_inside_of_task - redi...
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks - redis.exc...
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_cancel_themselves
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_change_their_parameters
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_perpetuate_even_after_errors
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_be_automatically_scheduled
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_schedule_next_run_after_delay
FAILED tests/fundamentals/test_perpetual.py::test_cancelled_automatic_perpetual_can_be_rescheduled
FAILED tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_schedule_next_run_at_specific_time
FAILED tests/fundamentals/test_progress_state.py::test_tasks_can_report_progress
FAILED tests/fundamentals/test_progress_state.py::test_tasks_can_access_execution_state
FAILED tests/fundamentals/test_progress_state.py::test_execution_state_lifecycle
FAILED tests/fundamentals/test_results.py::test_task_results_can_be_stored_and_retrieved
FAILED tests/fundamentals/test_retries.py::test_errors_are_logged - redis.exc...
FAILED tests/fundamentals/test_retries.py::test_supports_simple_linear_retries
FAILED tests/fundamentals/test_retries.py::test_supports_simple_linear_retries_with_delay
FAILED tests/fundamentals/test_retries.py::test_supports_infinite_retries - r...
FAILED tests/fundamentals/test_retries.py::test_supports_exponential_backoff_retries
FAILED tests/fundamentals/test_retries.py::test_supports_exponential_backoff_retries_under_maximum_delay
FAILED tests/fundamentals/test_scheduling.py::test_immediate_task_execution
FAILED tests/fundamentals/test_scheduling.py::test_immediate_task_execution_by_name
FAILED tests/fundamentals/test_scheduling.py::test_scheduled_execution - redi...
FAILED tests/fundamentals/test_scheduling.py::test_rescheduling_later - redis...
FAILED tests/fundamentals/test_scheduling.py::test_rescheduling_earlier - red...
FAILED tests/fundamentals/test_scheduling.py::test_rescheduling_by_name - red...
FAILED tests/fundamentals/test_scheduling.py::test_replace_without_existing_task_acts_like_add
FAILED tests/fundamentals/test_self_perpetuation.py::test_self_perpetuating_immediate_tasks
FAILED tests/fundamentals/test_self_perpetuation.py::test_self_perpetuating_scheduled_tasks
FAILED tests/fundamentals/test_self_perpetuation.py::test_infinitely_self_perpetuating_tasks
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_dependency_is_initialized_once
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_dependencies_are_same_instance
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_identity_is_factory_function
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_cleanup_on_worker_exit
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_depending_on_shared
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_depending_on_depends
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_can_access_current_docket_and_worker
FAILED tests/fundamentals/test_shared_dependencies.py::test_late_registered_task_with_new_shared
FAILED tests/fundamentals/test_shared_dependencies.py::test_multiple_shared_cleanup_order
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_cleanup_on_init_failure
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_async_function_factory
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_sync_function_factory
FAILED tests/fundamentals/test_shared_dependencies.py::test_shared_sync_context_manager_factory
FAILED tests/fundamentals/test_striking.py::test_striking_entire_tasks - redi...
FAILED tests/fundamentals/test_striking.py::test_striking_entire_parameters
FAILED tests/fundamentals/test_striking.py::test_striking_tasks_for_specific_parameters
FAILED tests/fundamentals/test_sync_dependencies.py::test_sync_function_dependencies
FAILED tests/fundamentals/test_sync_dependencies.py::test_sync_contextual_dependencies
FAILED tests/fundamentals/test_sync_dependencies.py::test_mixed_sync_and_async_dependencies
FAILED tests/fundamentals/test_sync_dependencies.py::test_sync_dependencies_of_dependencies
FAILED tests/fundamentals/test_sync_dependencies.py::test_sync_dependencies_can_ask_for_docket_dependencies
FAILED tests/fundamentals/test_sync_dependencies.py::test_mixed_sync_async_nested_dependencies
FAILED tests/fundamentals/test_timeouts.py::test_simple_timeout - redis.excep...
FAILED tests/fundamentals/test_timeouts.py::test_simple_timeout_cancels_tasks
FAILED tests/fundamentals/test_timeouts.py::test_timeout_can_be_extended - re...
FAILED tests/fundamentals/test_timeouts.py::test_timeout_extends_by_base_by_default
FAILED tests/fundamentals/test_timeouts.py::test_timeout_is_compatible_with_retry
FAILED tests/instrumentation/test_counters.py::test_adding_a_task_increments_counter
FAILED tests/instrumentation/test_counters.py::test_replacing_a_task_increments_counter
FAILED tests/instrumentation/test_counters.py::test_cancelling_a_task_increments_counter
FAILED tests/instrumentation/test_counters.py::test_worker_execution_increments_task_counters
FAILED tests/instrumentation/test_counters.py::test_failed_task_increments_failure_counter
FAILED tests/instrumentation/test_counters.py::test_retried_task_increments_retry_counter
FAILED tests/instrumentation/test_counters.py::test_exhausted_retried_task_increments_retry_counter
FAILED tests/instrumentation/test_counters.py::test_retried_task_metric_uses_bounded_labels
FAILED tests/instrumentation/test_counters.py::test_perpetuated_task_metric_uses_bounded_labels
FAILED tests/instrumentation/test_counters.py::test_redelivered_tasks_increment_redelivered_counter
FAILED tests/instrumentation/test_counters.py::test_superseded_task_increments_superseded_counter
FAILED tests/instrumentation/test_counters.py::test_replaced_task_only_counts_replacement
FAILED tests/instrumentation/test_export.py::test_task_duration_is_measured
FAILED tests/instrumentation/test_export.py::test_task_punctuality_is_measured
FAILED tests/instrumentation/test_export.py::test_task_running_gauge_is_incremented
FAILED tests/instrumentation/test_export.py::test_worker_publishes_depth_gauges
FAILED tests/test_agenda.py::test_agenda_scatter_basic - redis.exceptions.Res...
FAILED tests/test_agenda.py::test_agenda_scatter_with_start_time - redis.exce...
FAILED tests/test_agenda.py::test_agenda_scatter_with_jitter - redis.exceptio...
FAILED tests/test_agenda.py::test_agenda_scatter_with_large_jitter - redis.ex...
FAILED tests/test_agenda.py::test_agenda_scatter_single_task - redis.exceptio...
FAILED tests/test_agenda.py::test_agenda_scatter_heterogeneous_tasks - redis....
FAILED tests/test_agenda.py::test_agenda_scatter_preserves_order - redis.exce...
FAILED tests/test_agenda.py::test_agenda_reusability - redis.exceptions.Respo...
FAILED tests/test_agenda.py::test_agenda_scatter_with_task_by_name - redis.ex...
FAILED tests/test_agenda.py::test_agenda_scatter_partial_scheduling_behavior
FAILED tests/test_agenda.py::test_agenda_scatter_auto_registers_unregistered_functions
FAILED tests/test_cancellation.py::test_cancel_running_task - redis.exception...
FAILED tests/test_cancellation.py::test_cancel_running_task_state - redis.exc...
FAILED tests/test_cancellation.py::test_cancel_running_task_with_cleanup - re...
FAILED tests/test_cancellation.py::test_cancel_task_that_ignores_cancellation
FAILED tests/test_cancellation.py::test_cancel_already_completed_is_noop - re...
FAILED tests/test_cancellation.py::test_cancel_publishes_state_event - redis....
FAILED tests/test_cancellation.py::test_cancel_only_affects_running_worker - ...
FAILED tests/test_cancellation.py::test_cancel_running_task_with_zero_execution_ttl
FAILED tests/test_cancellation.py::test_cancelled_task_with_retry_does_not_retry
FAILED tests/test_cancellation.py::test_cancelled_perpetual_task_does_not_perpetuate
FAILED tests/test_cancellation.py::test_cancel_running_task_with_timeout - re...
FAILED tests/test_cancellation.py::test_get_result_raises_execution_cancelled_for_cancelled_task
FAILED tests/test_dependencies_advanced.py::test_sync_function_dependency - r...
FAILED tests/test_dependencies_advanced.py::test_sync_context_manager_dependency
FAILED tests/test_dependencies_advanced.py::test_mixed_sync_and_async_dependencies
FAILED tests/test_dependencies_advanced.py::test_nested_sync_dependencies - r...
FAILED tests/test_dependencies_advanced.py::test_sync_dependency_with_docket_context
FAILED tests/test_dependencies_advanced.py::test_sync_context_manager_cleanup_on_exception
FAILED tests/test_dependencies_advanced.py::test_sync_dependency_caching - re...
FAILED tests/test_dependencies_advanced.py::test_mixed_nested_dependencies - ...
FAILED tests/test_dependencies_advanced.py::test_contextvar_isolation_between_tasks
FAILED tests/test_dependencies_advanced.py::test_contextvar_cleanup_after_task
FAILED tests/test_dependencies_advanced.py::test_dependency_cache_isolated_between_tasks
FAILED tests/test_dependencies_advanced.py::test_async_exit_stack_cleanup - r...
FAILED tests/test_dependencies_core.py::test_dependencies_may_be_duplicated
FAILED tests/test_dependencies_core.py::test_users_can_provide_dependencies_directly
FAILED tests/test_dependencies_core.py::test_user_provide_retries_are_used - ...
FAILED tests/test_dependencies_core.py::test_user_can_request_a_retry_after_a_delay[Retry]
FAILED tests/test_dependencies_core.py::test_user_can_request_a_retry_after_a_delay[ExponentialRetry]
FAILED tests/test_dependencies_core.py::test_retry_in_is_backwards_compatible_alias_for_after
FAILED tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time[Retry]
FAILED tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time[ExponentialRetry]
FAILED tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time_in_the_past
FAILED tests/test_dependencies_core.py::test_dependencies_error_for_missing_task_argument
FAILED tests/test_dependencies_core.py::test_a_task_argument_cannot_ask_for_itself
FAILED tests/test_docket_clear.py::test_clear_with_immediate_tasks - redis.ex...
FAILED tests/test_docket_clear.py::test_clear_with_scheduled_tasks - redis.ex...
FAILED tests/test_docket_clear.py::test_clear_with_mixed_tasks - redis.except...
FAILED tests/test_docket_clear.py::test_clear_with_parked_tasks - redis.excep...
FAILED tests/test_docket_clear.py::test_clear_returns_total_count - redis.exc...
FAILED tests/test_docket_clear.py::test_clear_no_redis_key_leaks - redis.exce...
FAILED tests/test_docket_clear.py::test_clear_with_execution_ttl_zero - redis...
FAILED tests/test_docket_clear.py::test_docket_without_worker_does_not_create_group
FAILED tests/test_docket_clear.py::test_snapshot_handles_nogroup_with_real_redis[real]
FAILED tests/test_docket_execution.py::test_docket_schedule_method_with_immediate_task
FAILED tests/test_docket_execution.py::test_get_execution_for_scheduled_task
FAILED tests/test_docket_execution.py::test_get_execution_for_queued_task - r...
FAILED tests/test_docket_execution.py::test_get_execution_function_not_registered
FAILED tests/test_docket_execution.py::test_get_execution_with_complex_args
FAILED tests/test_docket_execution.py::test_get_execution_claim_check_pattern
FAILED tests/test_docket_execution.py::test_cancelled_state_creates_tombstone
FAILED tests/test_docket_execution.py::test_cancelled_state_respects_ttl - re...
FAILED tests/test_docket_execution.py::test_cancelled_state_with_ttl_zero - r...
FAILED tests/test_docket_execution.py::test_get_execution_after_cancel - redi...
FAILED tests/test_docket_execution.py::test_replace_does_not_set_cancelled_state
FAILED tests/test_docket_execution.py::test_cancellation_idempotent_with_tombstone
FAILED tests/test_docket_registration.py::test_registered_task_usable_after_aenter
FAILED tests/test_docket_registration.py::test_schedule_task_by_alias - redis...
FAILED tests/test_execution_state.py::test_run_state_scheduled - redis.except...
FAILED tests/test_execution_state.py::test_run_state_pending_to_running - red...
FAILED tests/test_execution_state.py::test_run_state_completed_on_success - r...
FAILED tests/test_execution_state.py::test_run_state_failed_on_exception - re...
FAILED tests/test_execution_state.py::test_run_state_ttl_after_completion - r...
FAILED tests/test_execution_state.py::test_custom_execution_ttl - redis.excep...
FAILED tests/test_execution_state.py::test_full_lifecycle_integration - redis...
FAILED tests/test_execution_state.py::test_run_add_returns_run_instance - red...
FAILED tests/test_execution_state.py::test_error_message_stored_on_failure - ...
FAILED tests/test_execution_state.py::test_mark_as_failed_without_error_message
FAILED tests/test_fallback_task.py::test_default_fallback_task_logs_and_acks
FAILED tests/test_fallback_task.py::test_custom_fallback_receives_original_args_kwargs
FAILED tests/test_fallback_task.py::test_fallback_can_access_function_name - ...
FAILED tests/test_fallback_task.py::test_fallback_dependency_injection - redi...
FAILED tests/test_fallback_task.py::test_fallback_custom_user_dependency - re...
FAILED tests/test_fallback_task.py::test_fallback_return_completes_task - red...
FAILED tests/test_fallback_task.py::test_fallback_exception_triggers_retry - ...
FAILED tests/test_fallback_task.py::test_execution_function_name_matches_for_known_tasks
FAILED tests/test_handler_semantics.py::test_retrying_task_is_not_marked_as_failed
FAILED tests/test_handler_semantics.py::test_exhausted_retries_marks_task_as_failed
FAILED tests/test_handler_semantics.py::test_failed_perpetual_task_is_rescheduled
FAILED tests/test_handler_semantics.py::test_retry_and_perpetual_work_together
FAILED tests/test_handler_semantics.py::test_perpetual_after_is_respected_on_failure
FAILED tests/test_key_leak_protection.py::test_leak_detection_catches_keys_without_ttl
FAILED tests/test_key_leak_protection.py::test_permanent_keys_are_exempt - re...
FAILED tests/test_key_leak_protection.py::test_exemption_mechanism - redis.ex...
FAILED tests/test_key_leak_protection.py::test_multiple_exemptions - redis.ex...
FAILED tests/test_key_leak_protection.py::test_worker_task_sets_are_exempt - ...
FAILED tests/test_key_leak_protection.py::test_queue_is_cleaned_up - redis.ex...
FAILED tests/test_memory_backend.py::test_docket_memory_backend - redis.excep...
FAILED tests/test_memory_backend.py::test_multiple_memory_dockets - redis.exc...
FAILED tests/test_memory_backend.py::test_memory_backend_reuses_server - redi...
FAILED tests/test_memory_backend.py::test_different_memory_urls_are_isolated
FAILED tests/test_memory_backend.py::test_memory_url_with_path_isolation - re...
FAILED tests/test_perpetual_race.py::test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_is_superseded_after_replace[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_is_superseded_after_replace[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_superseded_message_skipped_before_execution[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_superseded_message_skipped_before_execution[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_old_message_without_generation_runs_normally[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_old_message_without_generation_runs_normally[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_replace_skips_stale_stream_message[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_replace_skips_stale_stream_message[execution_ttl=60s]
FAILED tests/test_perpetual_race.py::test_perpetual_successor_survives_mark_as_terminal[execution_ttl=0]
FAILED tests/test_perpetual_race.py::test_perpetual_successor_survives_mark_as_terminal[execution_ttl=60s]
FAILED tests/test_perpetual_state.py::test_perpetual_task_with_ttl_zero - red...
FAILED tests/test_perpetual_state.py::test_perpetual_task_state_isolation - r...
FAILED tests/test_perpetual_state.py::test_perpetual_task_no_state_accumulation_with_ttl_zero
FAILED tests/test_perpetual_state.py::test_rapid_perpetual_tasks_no_conflicts
FAILED tests/test_perpetual_state.py::test_perpetual_same_key_no_state_accumulation
FAILED tests/test_perpetual_state.py::test_perpetual_task_state_transitions_with_same_key
FAILED tests/test_perpetual_state.py::test_perpetual_publishes_completed_event[ttl_zero]
FAILED tests/test_perpetual_state.py::test_perpetual_publishes_completed_event[default_ttl]
FAILED tests/test_progress_basics.py::test_progress_dependency_injection - re...
FAILED tests/test_progress_basics.py::test_progress_deleted_on_completion - r...
FAILED tests/test_progress_basics.py::test_progress_with_multiple_increments
FAILED tests/test_progress_basics.py::test_progress_without_total - redis.exc...
FAILED tests/test_progress_pubsub.py::test_state_publish_events - redis.excep...
FAILED tests/test_progress_pubsub.py::test_completed_state_publishes_event - ...
FAILED tests/test_progress_pubsub.py::test_failed_state_publishes_event_with_error
FAILED tests/test_progress_pubsub.py::test_end_to_end_progress_monitoring_with_worker
FAILED tests/test_progress_pubsub.py::test_end_to_end_failed_task_monitoring
FAILED tests/test_progress_pubsub.py::test_subscribing_to_completed_execution
FAILED tests/test_redelivery.py::test_redelivery_from_abandoned_worker - redi...
FAILED tests/test_redelivery.py::test_long_running_task_not_duplicated - redi...
FAILED tests/test_redelivery.py::test_retry_with_long_running_task - redis.ex...
FAILED tests/test_redelivery.py::test_multiple_workers_no_duplicate_execution
FAILED tests/test_redelivery.py::test_perpetual_task_with_lease_renewal - red...
FAILED tests/test_redelivery.py::test_user_timeout_longer_than_redelivery - r...
FAILED tests/test_redelivery.py::test_workers_with_same_redelivery_timeout - ...
FAILED tests/test_redelivery.py::test_worker_joining_doesnt_steal_renewed_lease
FAILED tests/test_redelivery.py::test_lease_renewal_recovers_from_redis_error
FAILED tests/test_results_retrieval.py::test_get_result_waits_for_completion
FAILED tests/test_results_retrieval.py::test_get_result_timeout - redis.excep...
FAILED tests/test_results_retrieval.py::test_multiple_concurrent_get_result_calls
FAILED tests/test_results_retrieval.py::test_get_result_on_already_completed_task
FAILED tests/test_results_retrieval.py::test_get_result_on_already_failed_task
FAILED tests/test_results_retrieval.py::test_get_result_with_malformed_result_data
FAILED tests/test_results_retrieval.py::test_get_result_failed_task_with_missing_exception_data
FAILED tests/test_results_retrieval.py::test_get_result_with_timeout_timedelta
FAILED tests/test_results_retrieval.py::test_get_result_with_deadline_datetime
FAILED tests/test_results_retrieval.py::test_get_result_timeout_on_pending_task
FAILED tests/test_results_storage.py::test_result_storage_for_int_return - re...
FAILED tests/test_results_storage.py::test_result_storage_for_str_return - re...
FAILED tests/test_results_storage.py::test_result_storage_for_dict_return - r...
FAILED tests/test_results_storage.py::test_result_storage_for_object_return
FAILED tests/test_results_storage.py::test_no_storage_for_none_annotated_task
FAILED tests/test_results_storage.py::test_no_storage_for_runtime_none - redi...
FAILED tests/test_results_storage.py::test_exception_storage_and_retrieval - ...
FAILED tests/test_results_storage.py::test_result_key_stored_in_execution_record
FAILED tests/test_striking.py::test_strike_incomparable_values[>-42-string]
FAILED tests/test_striking.py::test_strike_incomparable_values[<-string-42]
FAILED tests/test_striking.py::test_strike_incomparable_values[>=-None-42] - ...
FAILED tests/test_striking.py::test_strike_incomparable_values[<=-42-None] - ...
FAILED tests/test_striking.py::test_strike_incomparable_values[>-value4-42]
FAILED tests/test_striking.py::test_strike_incomparable_values[<-42-test_value5]
FAILED tests/test_striking.py::test_strike_incomparable_values[>=-value6-42]
FAILED tests/test_striking.py::test_strike_incomparable_values[<=-42-test_value7]
FAILED tests/test_striking.py::test_restored_automatic_perpetual_does_start
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_only
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_and_args
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_and_kwargs
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_args_and_kwargs
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_task_by_key - ...
FAILED tests/test_testing.py::test_assert_task_scheduled_works_with_function_name
FAILED tests/test_testing.py::test_assert_task_scheduled_succeeds_with_multiple_matching_tasks
FAILED tests/test_testing.py::test_assert_task_scheduled_fails_when_task_not_found
FAILED tests/test_testing.py::test_assert_task_scheduled_fails_when_args_dont_match
FAILED tests/test_testing.py::test_assert_task_scheduled_fails_when_kwargs_dont_match
FAILED tests/test_testing.py::test_assert_task_scheduled_finds_scheduled_future_task
FAILED tests/test_testing.py::test_assert_task_not_scheduled_succeeds_when_different_task
FAILED tests/test_testing.py::test_assert_task_not_scheduled_fails_when_task_exists
FAILED tests/test_testing.py::test_assert_task_not_scheduled_with_specific_args
FAILED tests/test_testing.py::test_assert_task_count_all_tasks - redis.except...
FAILED tests/test_testing.py::test_assert_task_count_for_specific_function - ...
FAILED tests/test_testing.py::test_assert_task_count_fails_with_wrong_count
FAILED tests/test_testing.py::test_assert_task_count_with_function_name - red...
FAILED tests/test_testing.py::test_assert_no_tasks_fails_when_tasks_present
FAILED tests/test_testing.py::test_assert_no_tasks_after_tasks_complete - red...
FAILED tests/test_testing.py::test_assert_task_scheduled_partial_kwargs_match
FAILED tests/test_testing.py::test_assert_task_count_includes_future_and_immediate_tasks
FAILED tests/test_testing.py::test_assert_task_scheduled_fails_when_key_doesnt_match
FAILED tests/worker/test_bootstrap.py::test_redis_key_cleanup_successful_task
FAILED tests/worker/test_bootstrap.py::test_redis_key_cleanup_failed_task - r...
FAILED tests/worker/test_bootstrap.py::test_redis_key_cleanup_cancelled_task
FAILED tests/worker/test_bootstrap.py::test_consumer_group_created_on_first_worker_read
FAILED tests/worker/test_bootstrap.py::test_multiple_workers_racing_to_create_group
FAILED tests/worker/test_bootstrap.py::test_worker_handles_nogroup_error_gracefully
FAILED tests/worker/test_bootstrap.py::test_worker_handles_nogroup_in_xreadgroup
FAILED tests/worker/test_core.py::test_worker_acknowledges_messages - redis.e...
FAILED tests/worker/test_core.py::test_two_workers_split_work - redis.excepti...
FAILED tests/worker/test_core.py::test_worker_reconnects_when_connection_is_lost
FAILED tests/worker/test_core.py::test_worker_respects_concurrency_limit - re...
FAILED tests/worker/test_core.py::test_worker_handles_unregistered_task_execution_on_initial_delivery
FAILED tests/worker/test_core.py::test_worker_handles_unregistered_task_execution_on_redelivery
FAILED tests/worker/test_core.py::test_worker_concurrency_cleanup_without_dependencies
FAILED tests/worker/test_core.py::test_worker_concurrency_no_limit_with_custom_docket
FAILED tests/worker/test_core.py::test_worker_exception_before_dependencies
FAILED tests/worker/test_invariants.py::test_invariant_tasks_by_key_empty_after_completion
FAILED tests/worker/test_invariants.py::test_invariant_tasks_by_key_no_growth_over_batches
FAILED tests/worker/test_invariants.py::test_invariant_execution_counts_empty_after_completion
FAILED tests/worker/test_invariants.py::test_invariant_execution_counts_cleared_after_run_at_most
FAILED tests/worker/test_invariants.py::test_invariant_cleanup_after_task_exceptions
FAILED tests/worker/test_invariants.py::test_invariant_cleanup_with_varied_tasks
FAILED tests/worker/test_lifecycle.py::test_run_forever_cancels_promptly_with_future_tasks
FAILED tests/worker/test_lifecycle.py::test_run_until_finished_exits_promptly_with_future_tasks
FAILED tests/worker/test_lifecycle.py::test_run_at_most_cancels_promptly_with_future_tasks
FAILED tests/worker/test_lifecycle.py::test_worker_drains_active_tasks_on_shutdown
FAILED tests/worker/test_scheduling.py::test_perpetual_tasks_are_scheduled_close_to_target_time
FAILED tests/worker/test_scheduling.py::test_worker_can_exit_from_perpetual_tasks_that_queue_further_tasks
FAILED tests/worker/test_scheduling.py::test_worker_can_exit_from_long_horizon_perpetual_tasks
FAILED tests/worker/test_scheduling.py::test_worker_timeout_exceeds_redelivery_timeout
FAILED tests/worker/test_scheduling.py::test_replacement_race_condition_stream_tasks
FAILED tests/worker/test_scheduling.py::test_replace_task_in_queue_before_stream
FAILED tests/worker/test_scheduling.py::test_rapid_replace_operations - redis...
FAILED tests/worker/test_scheduling.py::test_duplicate_execution_race_condition_non_perpetual_task[default_ttl]
FAILED tests/worker/test_scheduling.py::test_duplicate_execution_race_condition_non_perpetual_task[zero_ttl]
FAILED tests/worker/test_scheduling.py::test_wrongtype_error_with_legacy_known_task_key
FAILED tests/worker/test_scheduling.py::test_replace_task_with_legacy_known_key
FAILED tests/worker/test_ttl_zero.py::test_task_executes_with_ttl_zero - redi...
FAILED tests/worker/test_ttl_zero.py::test_state_record_expires_immediately_with_ttl_zero
FAILED tests/worker/test_ttl_zero.py::test_result_storage_with_ttl_zero - red...
FAILED tests/worker/test_ttl_zero.py::test_failed_task_with_ttl_zero - redis....
FAILED tests/worker/test_ttl_zero.py::test_mixed_ttl_workload - redis.excepti...
ERROR tests/cli/test_worker.py::test_rich_logging_format - AssertionError: Me...
ERROR tests/cli/test_worker.py::test_plain_logging_format - AssertionError: M...
ERROR tests/concurrency_limits/test_basic.py::test_basic_concurrency_limit - ...
ERROR tests/concurrency_limits/test_basic.py::test_per_task_concurrency_limit
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_single_argument
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_different_arguments
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_max_concurrent
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_missing_argument_error
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_with_custom_scope
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_limit_without_concurrency_dependency
ERROR tests/concurrency_limits/test_basic.py::test_concurrency_keys_are_handled
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_with_task_failures
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_error_handling_during_execution
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_multiple_workers_coordination
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_refresh_handles_redis_errors
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_robustness_under_stress
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_concurrency_edge_cases
ERROR tests/concurrency_limits/test_errors_and_resilience.py::test_worker_graceful_shutdown_with_concurrency_management
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_limits_task_queuing_behavior
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_different_customer_branches
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_limits_different_scopes
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_refresh_mechanism_integration
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_with_quick_tasks
ERROR tests/concurrency_limits/test_execution_patterns.py::test_worker_concurrency_with_dependencies_integration
ERROR tests/concurrency_limits/test_execution_patterns.py::test_concurrency_limited_task_successfully_acquires_slot
ERROR tests/concurrency_limits/test_redelivery.py::test_task_timeout_with_explicit_timeout
ERROR tests/concurrency_limits/test_redelivery.py::test_task_timeout_with_concurrent_tasks
ERROR tests/concurrency_limits/test_redelivery.py::test_explicit_timeout_limits_long_tasks
ERROR tests/concurrency_limits/test_redelivery.py::test_short_tasks_complete_within_timeout
ERROR tests/concurrency_limits/test_redelivery.py::test_redeliveries_respect_concurrency_limits
ERROR tests/concurrency_limits/test_redelivery.py::test_concurrency_blocked_task_executes_exactly_once
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_missing_argument_fails_task
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_no_limit_early_return
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_missing_argument_shows_available_args
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_on_success
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_on_failure
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_concurrency_cleanup_after_task_completion
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_worker_handles_concurrent_task_cleanup_gracefully
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_finally_block_releases_concurrency_on_success
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_stale_concurrency_slots_are_scavenged_when_full
ERROR tests/concurrency_limits/test_worker_mechanics.py::test_graceful_shutdown_releases_concurrency_slots
ERROR tests/fundamentals/test_async_dependencies.py::test_simple_function_dependencies
ERROR tests/fundamentals/test_async_dependencies.py::test_contextual_dependencies
ERROR tests/fundamentals/test_async_dependencies.py::test_dependencies_of_dependencies
ERROR tests/fundamentals/test_async_dependencies.py::test_dependencies_can_ask_for_docket_dependencies
ERROR tests/fundamentals/test_async_dependencies.py::test_dependency_failures_are_task_failures
ERROR tests/fundamentals/test_async_dependencies.py::test_contextual_dependency_before_failures_are_task_failures
ERROR tests/fundamentals/test_async_dependencies.py::test_contextual_dependency_after_failures_are_task_failures
ERROR tests/fundamentals/test_async_dependencies.py::test_dependencies_can_ask_for_task_arguments
ERROR tests/fundamentals/test_async_dependencies.py::test_task_arguments_may_be_optional
ERROR tests/fundamentals/test_builtin_tasks.py::test_all_dockets_have_a_trace_task
ERROR tests/fundamentals/test_builtin_tasks.py::test_all_dockets_have_a_fail_task
ERROR tests/fundamentals/test_cancellation.py::test_cancelling_future_task - ...
ERROR tests/fundamentals/test_cancellation.py::test_cancelling_immediate_task
ERROR tests/fundamentals/test_cancellation.py::test_cancellation_is_idempotent
ERROR tests/fundamentals/test_context_injection.py::test_supports_requesting_current_docket
ERROR tests/fundamentals/test_context_injection.py::test_supports_requesting_current_worker
ERROR tests/fundamentals/test_context_injection.py::test_supports_requesting_current_execution
ERROR tests/fundamentals/test_context_injection.py::test_supports_requesting_current_task_key
ERROR tests/fundamentals/test_cron.py::test_cron_task_reschedules_itself - As...
ERROR tests/fundamentals/test_cron.py::test_cron_tasks_are_automatically_scheduled
ERROR tests/fundamentals/test_cron.py::test_cron_tasks_continue_after_errors
ERROR tests/fundamentals/test_cron.py::test_cron_tasks_can_cancel_themselves
ERROR tests/fundamentals/test_cron.py::test_cron_supports_vixie_keywords - As...
ERROR tests/fundamentals/test_cron.py::test_automatic_cron_waits_for_scheduled_time
ERROR tests/fundamentals/test_cron.py::test_cron_with_timezone - AssertionErr...
ERROR tests/fundamentals/test_errors.py::test_adding_task_with_unbindable_arguments
ERROR tests/fundamentals/test_idempotency.py::test_adding_is_idempotent - Ass...
ERROR tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_in_the_future
ERROR tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_between_the_future_and_present
ERROR tests/fundamentals/test_idempotency.py::test_task_keys_are_idempotent_in_the_present
ERROR tests/fundamentals/test_logging.py::test_tasks_can_opt_into_argument_logging
ERROR tests/fundamentals/test_logging.py::test_tasks_can_opt_into_logging_collection_lengths
ERROR tests/fundamentals/test_logging.py::test_logging_inside_of_task - Asser...
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks - AssertionE...
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_cancel_themselves
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_change_their_parameters
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_perpetuate_even_after_errors
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_be_automatically_scheduled
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_schedule_next_run_after_delay
ERROR tests/fundamentals/test_perpetual.py::test_cancelled_automatic_perpetual_can_be_rescheduled
ERROR tests/fundamentals/test_perpetual.py::test_perpetual_tasks_can_schedule_next_run_at_specific_time
ERROR tests/fundamentals/test_progress_state.py::test_tasks_can_report_progress
ERROR tests/fundamentals/test_progress_state.py::test_tasks_can_access_execution_state
ERROR tests/fundamentals/test_progress_state.py::test_execution_state_lifecycle
ERROR tests/fundamentals/test_results.py::test_task_results_can_be_stored_and_retrieved
ERROR tests/fundamentals/test_retries.py::test_errors_are_logged - AssertionE...
ERROR tests/fundamentals/test_retries.py::test_supports_simple_linear_retries
ERROR tests/fundamentals/test_retries.py::test_supports_simple_linear_retries_with_delay
ERROR tests/fundamentals/test_retries.py::test_supports_infinite_retries - As...
ERROR tests/fundamentals/test_retries.py::test_supports_exponential_backoff_retries
ERROR tests/fundamentals/test_retries.py::test_supports_exponential_backoff_retries_under_maximum_delay
ERROR tests/fundamentals/test_scheduling.py::test_immediate_task_execution - ...
ERROR tests/fundamentals/test_scheduling.py::test_immediate_task_execution_by_name
ERROR tests/fundamentals/test_scheduling.py::test_scheduled_execution - Asser...
ERROR tests/fundamentals/test_scheduling.py::test_rescheduling_later - Assert...
ERROR tests/fundamentals/test_scheduling.py::test_rescheduling_earlier - Asse...
ERROR tests/fundamentals/test_scheduling.py::test_rescheduling_by_name - Asse...
ERROR tests/fundamentals/test_scheduling.py::test_replace_without_existing_task_acts_like_add
ERROR tests/fundamentals/test_self_perpetuation.py::test_self_perpetuating_immediate_tasks
ERROR tests/fundamentals/test_self_perpetuation.py::test_self_perpetuating_scheduled_tasks
ERROR tests/fundamentals/test_self_perpetuation.py::test_infinitely_self_perpetuating_tasks
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_dependency_is_initialized_once
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_dependencies_are_same_instance
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_identity_is_factory_function
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_cleanup_on_worker_exit
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_depending_on_shared
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_depending_on_depends
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_can_access_current_docket_and_worker
ERROR tests/fundamentals/test_shared_dependencies.py::test_late_registered_task_with_new_shared
ERROR tests/fundamentals/test_shared_dependencies.py::test_multiple_shared_cleanup_order
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_cleanup_on_init_failure
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_async_function_factory
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_sync_function_factory
ERROR tests/fundamentals/test_shared_dependencies.py::test_shared_sync_context_manager_factory
ERROR tests/fundamentals/test_striking.py::test_striking_entire_tasks - Asser...
ERROR tests/fundamentals/test_sync_dependencies.py::test_sync_function_dependencies
ERROR tests/fundamentals/test_sync_dependencies.py::test_sync_contextual_dependencies
ERROR tests/fundamentals/test_sync_dependencies.py::test_mixed_sync_and_async_dependencies
ERROR tests/fundamentals/test_sync_dependencies.py::test_sync_dependencies_of_dependencies
ERROR tests/fundamentals/test_sync_dependencies.py::test_sync_dependencies_can_ask_for_docket_dependencies
ERROR tests/fundamentals/test_sync_dependencies.py::test_mixed_sync_async_nested_dependencies
ERROR tests/fundamentals/test_timeouts.py::test_simple_timeout - AssertionErr...
ERROR tests/fundamentals/test_timeouts.py::test_simple_timeout_cancels_tasks
ERROR tests/fundamentals/test_timeouts.py::test_timeout_can_be_extended - Ass...
ERROR tests/fundamentals/test_timeouts.py::test_timeout_extends_by_base_by_default
ERROR tests/fundamentals/test_timeouts.py::test_timeout_is_compatible_with_retry
ERROR tests/instrumentation/test_counters.py::test_adding_a_task_increments_counter
ERROR tests/instrumentation/test_counters.py::test_replacing_a_task_increments_counter
ERROR tests/instrumentation/test_counters.py::test_cancelling_a_task_increments_counter
ERROR tests/instrumentation/test_counters.py::test_worker_execution_increments_task_counters
ERROR tests/instrumentation/test_counters.py::test_failed_task_increments_failure_counter
ERROR tests/instrumentation/test_counters.py::test_retried_task_increments_retry_counter
ERROR tests/instrumentation/test_counters.py::test_exhausted_retried_task_increments_retry_counter
ERROR tests/instrumentation/test_counters.py::test_retried_task_metric_uses_bounded_labels
ERROR tests/instrumentation/test_counters.py::test_perpetuated_task_metric_uses_bounded_labels
ERROR tests/instrumentation/test_counters.py::test_redelivered_tasks_increment_redelivered_counter
ERROR tests/instrumentation/test_counters.py::test_superseded_task_increments_superseded_counter
ERROR tests/instrumentation/test_counters.py::test_replaced_task_only_counts_replacement
ERROR tests/instrumentation/test_export.py::test_task_duration_is_measured - ...
ERROR tests/instrumentation/test_export.py::test_task_punctuality_is_measured
ERROR tests/instrumentation/test_export.py::test_task_running_gauge_is_incremented
ERROR tests/instrumentation/test_export.py::test_worker_publishes_depth_gauges
ERROR tests/test_agenda.py::test_agenda_scatter_basic - AssertionError: Memor...
ERROR tests/test_agenda.py::test_agenda_scatter_with_start_time - AssertionEr...
ERROR tests/test_agenda.py::test_agenda_scatter_with_jitter - AssertionError:...
ERROR tests/test_agenda.py::test_agenda_scatter_with_large_jitter - Assertion...
ERROR tests/test_agenda.py::test_agenda_scatter_single_task - AssertionError:...
ERROR tests/test_agenda.py::test_agenda_scatter_heterogeneous_tasks - Asserti...
ERROR tests/test_agenda.py::test_agenda_scatter_preserves_order - AssertionEr...
ERROR tests/test_agenda.py::test_agenda_reusability - AssertionError: Memory ...
ERROR tests/test_agenda.py::test_agenda_scatter_with_task_by_name - Assertion...
ERROR tests/test_agenda.py::test_agenda_scatter_partial_scheduling_behavior
ERROR tests/test_agenda.py::test_agenda_scatter_auto_registers_unregistered_functions
ERROR tests/test_cancellation.py::test_cancel_running_task - AssertionError: ...
ERROR tests/test_cancellation.py::test_cancel_running_task_state - AssertionE...
ERROR tests/test_cancellation.py::test_cancel_running_task_with_cleanup - Ass...
ERROR tests/test_cancellation.py::test_cancel_task_that_ignores_cancellation
ERROR tests/test_cancellation.py::test_cancel_already_completed_is_noop - Ass...
ERROR tests/test_cancellation.py::test_cancel_publishes_state_event - Asserti...
ERROR tests/test_cancellation.py::test_cancel_only_affects_running_worker - A...
ERROR tests/test_cancellation.py::test_cancelled_task_with_retry_does_not_retry
ERROR tests/test_cancellation.py::test_cancelled_perpetual_task_does_not_perpetuate
ERROR tests/test_cancellation.py::test_cancel_running_task_with_timeout - Ass...
ERROR tests/test_cancellation.py::test_get_result_raises_execution_cancelled_for_cancelled_task
ERROR tests/test_dependencies_advanced.py::test_sync_function_dependency - As...
ERROR tests/test_dependencies_advanced.py::test_sync_context_manager_dependency
ERROR tests/test_dependencies_advanced.py::test_mixed_sync_and_async_dependencies
ERROR tests/test_dependencies_advanced.py::test_nested_sync_dependencies - As...
ERROR tests/test_dependencies_advanced.py::test_sync_dependency_with_docket_context
ERROR tests/test_dependencies_advanced.py::test_sync_context_manager_cleanup_on_exception
ERROR tests/test_dependencies_advanced.py::test_sync_dependency_caching - Ass...
ERROR tests/test_dependencies_advanced.py::test_mixed_nested_dependencies - A...
ERROR tests/test_dependencies_advanced.py::test_contextvar_isolation_between_tasks
ERROR tests/test_dependencies_advanced.py::test_contextvar_cleanup_after_task
ERROR tests/test_dependencies_advanced.py::test_dependency_cache_isolated_between_tasks
ERROR tests/test_dependencies_advanced.py::test_async_exit_stack_cleanup - As...
ERROR tests/test_dependencies_core.py::test_dependencies_may_be_duplicated - ...
ERROR tests/test_dependencies_core.py::test_users_can_provide_dependencies_directly
ERROR tests/test_dependencies_core.py::test_user_provide_retries_are_used - A...
ERROR tests/test_dependencies_core.py::test_user_can_request_a_retry_after_a_delay[Retry]
ERROR tests/test_dependencies_core.py::test_user_can_request_a_retry_after_a_delay[ExponentialRetry]
ERROR tests/test_dependencies_core.py::test_retry_in_is_backwards_compatible_alias_for_after
ERROR tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time[Retry]
ERROR tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time[ExponentialRetry]
ERROR tests/test_dependencies_core.py::test_user_can_request_a_retry_at_a_specific_time_in_the_past
ERROR tests/test_dependencies_core.py::test_dependencies_error_for_missing_task_argument
ERROR tests/test_dependencies_core.py::test_a_task_argument_cannot_ask_for_itself
ERROR tests/test_docket_clear.py::test_clear_with_immediate_tasks - Assertion...
ERROR tests/test_docket_clear.py::test_clear_with_scheduled_tasks - Assertion...
ERROR tests/test_docket_clear.py::test_clear_with_mixed_tasks - AssertionErro...
ERROR tests/test_docket_clear.py::test_clear_with_parked_tasks - AssertionErr...
ERROR tests/test_docket_clear.py::test_clear_returns_total_count - AssertionE...
ERROR tests/test_docket_clear.py::test_clear_no_redis_key_leaks - AssertionEr...
ERROR tests/test_docket_execution.py::test_docket_schedule_method_with_immediate_task
ERROR tests/test_docket_execution.py::test_get_execution_for_scheduled_task
ERROR tests/test_docket_execution.py::test_get_execution_for_queued_task - As...
ERROR tests/test_docket_execution.py::test_get_execution_function_not_registered
ERROR tests/test_docket_execution.py::test_get_execution_with_complex_args - ...
ERROR tests/test_docket_execution.py::test_get_execution_claim_check_pattern
ERROR tests/test_docket_execution.py::test_cancelled_state_creates_tombstone
ERROR tests/test_docket_execution.py::test_cancelled_state_respects_ttl - Ass...
ERROR tests/test_docket_execution.py::test_get_execution_after_cancel - Asser...
ERROR tests/test_docket_execution.py::test_replace_does_not_set_cancelled_state
ERROR tests/test_docket_execution.py::test_cancellation_idempotent_with_tombstone
ERROR tests/test_docket_registration.py::test_schedule_task_by_alias - Assert...
ERROR tests/test_execution_state.py::test_run_state_scheduled - AssertionErro...
ERROR tests/test_execution_state.py::test_run_state_pending_to_running - Asse...
ERROR tests/test_execution_state.py::test_run_state_completed_on_success - As...
ERROR tests/test_execution_state.py::test_run_state_failed_on_exception - Ass...
ERROR tests/test_execution_state.py::test_run_state_ttl_after_completion - As...
ERROR tests/test_execution_state.py::test_full_lifecycle_integration - Assert...
ERROR tests/test_execution_state.py::test_run_add_returns_run_instance - Asse...
ERROR tests/test_execution_state.py::test_error_message_stored_on_failure - A...
ERROR tests/test_execution_state.py::test_mark_as_failed_without_error_message
ERROR tests/test_fallback_task.py::test_default_fallback_task_logs_and_acks
ERROR tests/test_fallback_task.py::test_custom_fallback_receives_original_args_kwargs
ERROR tests/test_fallback_task.py::test_fallback_can_access_function_name - A...
ERROR tests/test_fallback_task.py::test_fallback_dependency_injection - Asser...
ERROR tests/test_fallback_task.py::test_fallback_custom_user_dependency - Ass...
ERROR tests/test_fallback_task.py::test_fallback_return_completes_task - Asse...
ERROR tests/test_fallback_task.py::test_fallback_exception_triggers_retry - A...
ERROR tests/test_fallback_task.py::test_execution_function_name_matches_for_known_tasks
ERROR tests/test_handler_semantics.py::test_retrying_task_is_not_marked_as_failed
ERROR tests/test_handler_semantics.py::test_exhausted_retries_marks_task_as_failed
ERROR tests/test_handler_semantics.py::test_failed_perpetual_task_is_rescheduled
ERROR tests/test_handler_semantics.py::test_retry_and_perpetual_work_together
ERROR tests/test_handler_semantics.py::test_perpetual_after_is_respected_on_failure
ERROR tests/test_key_leak_protection.py::test_leak_detection_catches_keys_without_ttl
ERROR tests/test_key_leak_protection.py::test_permanent_keys_are_exempt - Ass...
ERROR tests/test_key_leak_protection.py::test_exemption_mechanism - Assertion...
ERROR tests/test_key_leak_protection.py::test_multiple_exemptions - Assertion...
ERROR tests/test_key_leak_protection.py::test_worker_task_sets_are_exempt - A...
ERROR tests/test_key_leak_protection.py::test_queue_is_cleaned_up - Assertion...
ERROR tests/test_perpetual_race.py::test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_stale_perpetual_on_complete_overwrites_correct_successor[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_is_superseded_after_replace[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_is_superseded_after_replace[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_superseded_message_skipped_before_execution[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_superseded_message_skipped_before_execution[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_old_message_without_generation_runs_normally[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_old_message_without_generation_runs_normally[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_new_task_moved_by_old_scheduler_runs_normally[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_replace_skips_stale_stream_message[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_replace_skips_stale_stream_message[execution_ttl=60s]
ERROR tests/test_perpetual_race.py::test_perpetual_successor_survives_mark_as_terminal[execution_ttl=0]
ERROR tests/test_perpetual_race.py::test_perpetual_successor_survives_mark_as_terminal[execution_ttl=60s]
ERROR tests/test_perpetual_state.py::test_perpetual_task_state_isolation - As...
ERROR tests/test_perpetual_state.py::test_rapid_perpetual_tasks_no_conflicts
ERROR tests/test_perpetual_state.py::test_perpetual_same_key_no_state_accumulation
ERROR tests/test_perpetual_state.py::test_perpetual_task_state_transitions_with_same_key
ERROR tests/test_progress_basics.py::test_progress_create - ExceptionGroup: e...
ERROR tests/test_progress_basics.py::test_progress_set_total - ExceptionGroup...
ERROR tests/test_progress_basics.py::test_progress_increment - ExceptionGroup...
ERROR tests/test_progress_basics.py::test_progress_set_message - ExceptionGro...
ERROR tests/test_progress_basics.py::test_progress_dependency_injection - Ass...
ERROR tests/test_progress_basics.py::test_progress_deleted_on_completion - As...
ERROR tests/test_progress_basics.py::test_progress_with_multiple_increments
ERROR tests/test_progress_basics.py::test_progress_without_total - AssertionE...
ERROR tests/test_progress_basics.py::test_concurrent_progress_updates - Excep...
ERROR tests/test_progress_pubsub.py::test_progress_publish_events - Exception...
ERROR tests/test_progress_pubsub.py::test_state_publish_events - AssertionErr...
ERROR tests/test_progress_pubsub.py::test_run_subscribe_both_state_and_progress
ERROR tests/test_progress_pubsub.py::test_completed_state_publishes_event - E...
ERROR tests/test_progress_pubsub.py::test_failed_state_publishes_event_with_error
ERROR tests/test_progress_pubsub.py::test_end_to_end_progress_monitoring_with_worker
ERROR tests/test_progress_pubsub.py::test_end_to_end_failed_task_monitoring
ERROR tests/test_progress_pubsub.py::test_subscribing_to_completed_execution
ERROR tests/test_redelivery.py::test_redelivery_from_abandoned_worker - Asser...
ERROR tests/test_redelivery.py::test_long_running_task_not_duplicated - Asser...
ERROR tests/test_redelivery.py::test_retry_with_long_running_task - Assertion...
ERROR tests/test_redelivery.py::test_multiple_workers_no_duplicate_execution
ERROR tests/test_redelivery.py::test_perpetual_task_with_lease_renewal - Asse...
ERROR tests/test_redelivery.py::test_user_timeout_longer_than_redelivery - As...
ERROR tests/test_redelivery.py::test_workers_with_same_redelivery_timeout - A...
ERROR tests/test_redelivery.py::test_worker_joining_doesnt_steal_renewed_lease
ERROR tests/test_redelivery.py::test_lease_renewal_recovers_from_redis_error
ERROR tests/test_results_retrieval.py::test_get_result_waits_for_completion
ERROR tests/test_results_retrieval.py::test_get_result_timeout - AssertionErr...
ERROR tests/test_results_retrieval.py::test_multiple_concurrent_get_result_calls
ERROR tests/test_results_retrieval.py::test_get_result_on_already_completed_task
ERROR tests/test_results_retrieval.py::test_get_result_on_already_failed_task
ERROR tests/test_results_retrieval.py::test_get_result_with_malformed_result_data
ERROR tests/test_results_retrieval.py::test_get_result_failed_task_with_missing_exception_data
ERROR tests/test_results_retrieval.py::test_get_result_with_timeout_timedelta
ERROR tests/test_results_retrieval.py::test_get_result_with_deadline_datetime
ERROR tests/test_results_retrieval.py::test_get_result_timeout_on_pending_task
ERROR tests/test_results_storage.py::test_result_storage_for_int_return - Ass...
ERROR tests/test_results_storage.py::test_result_storage_for_str_return - Ass...
ERROR tests/test_results_storage.py::test_result_storage_for_dict_return - As...
ERROR tests/test_results_storage.py::test_result_storage_for_object_return - ...
ERROR tests/test_results_storage.py::test_no_storage_for_none_annotated_task
ERROR tests/test_results_storage.py::test_no_storage_for_runtime_none - Asser...
ERROR tests/test_results_storage.py::test_exception_storage_and_retrieval - A...
ERROR tests/test_results_storage.py::test_result_key_stored_in_execution_record
ERROR tests/test_striking.py::test_strike_incomparable_values[>-42-string] - ...
ERROR tests/test_striking.py::test_strike_incomparable_values[<-string-42] - ...
ERROR tests/test_striking.py::test_strike_incomparable_values[>=-None-42] - A...
ERROR tests/test_striking.py::test_strike_incomparable_values[<=-42-None] - A...
ERROR tests/test_striking.py::test_strike_incomparable_values[>-value4-42] - ...
ERROR tests/test_striking.py::test_strike_incomparable_values[<-42-test_value5]
ERROR tests/test_striking.py::test_strike_incomparable_values[>=-value6-42]
ERROR tests/test_striking.py::test_strike_incomparable_values[<=-42-test_value7]
ERROR tests/test_striking.py::test_restored_automatic_perpetual_does_start - ...
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_only
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_and_args
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_and_kwargs
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_task_by_function_args_and_kwargs
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_task_by_key - A...
ERROR tests/test_testing.py::test_assert_task_scheduled_works_with_function_name
ERROR tests/test_testing.py::test_assert_task_scheduled_succeeds_with_multiple_matching_tasks
ERROR tests/test_testing.py::test_assert_task_scheduled_fails_when_task_not_found
ERROR tests/test_testing.py::test_assert_task_scheduled_fails_when_args_dont_match
ERROR tests/test_testing.py::test_assert_task_scheduled_fails_when_kwargs_dont_match
ERROR tests/test_testing.py::test_assert_task_scheduled_finds_scheduled_future_task
ERROR tests/test_testing.py::test_assert_task_not_scheduled_succeeds_when_different_task
ERROR tests/test_testing.py::test_assert_task_not_scheduled_fails_when_task_exists
ERROR tests/test_testing.py::test_assert_task_not_scheduled_with_specific_args
ERROR tests/test_testing.py::test_assert_task_count_all_tasks - AssertionErro...
ERROR tests/test_testing.py::test_assert_task_count_for_specific_function - A...
ERROR tests/test_testing.py::test_assert_task_count_fails_with_wrong_count - ...
ERROR tests/test_testing.py::test_assert_task_count_with_function_name - Asse...
ERROR tests/test_testing.py::test_assert_no_tasks_fails_when_tasks_present - ...
ERROR tests/test_testing.py::test_assert_no_tasks_after_tasks_complete - Asse...
ERROR tests/test_testing.py::test_assert_task_scheduled_partial_kwargs_match
ERROR tests/test_testing.py::test_assert_task_count_includes_future_and_immediate_tasks
ERROR tests/test_testing.py::test_assert_task_scheduled_fails_when_key_doesnt_match
ERROR tests/worker/test_bootstrap.py::test_redis_key_cleanup_successful_task
ERROR tests/worker/test_bootstrap.py::test_redis_key_cleanup_failed_task - As...
ERROR tests/worker/test_bootstrap.py::test_redis_key_cleanup_cancelled_task
ERROR tests/worker/test_core.py::test_worker_acknowledges_messages - Assertio...
ERROR tests/worker/test_core.py::test_two_workers_split_work - AssertionError...
ERROR tests/worker/test_core.py::test_worker_reconnects_when_connection_is_lost
ERROR tests/worker/test_core.py::test_worker_respects_concurrency_limit - Ass...
ERROR tests/worker/test_core.py::test_worker_handles_unregistered_task_execution_on_initial_delivery
ERROR tests/worker/test_core.py::test_worker_handles_unregistered_task_execution_on_redelivery
ERROR tests/worker/test_core.py::test_worker_concurrency_cleanup_without_dependencies
ERROR tests/worker/test_core.py::test_worker_concurrency_no_limit_with_custom_docket
ERROR tests/worker/test_core.py::test_worker_exception_before_dependencies - ...
ERROR tests/worker/test_invariants.py::test_invariant_tasks_by_key_empty_after_completion
ERROR tests/worker/test_invariants.py::test_invariant_tasks_by_key_no_growth_over_batches
ERROR tests/worker/test_invariants.py::test_invariant_execution_counts_empty_after_completion
ERROR tests/worker/test_invariants.py::test_invariant_execution_counts_cleared_after_run_at_most
ERROR tests/worker/test_invariants.py::test_invariant_cleanup_after_task_exceptions
ERROR tests/worker/test_invariants.py::test_invariant_cleanup_with_varied_tasks
ERROR tests/worker/test_lifecycle.py::test_run_forever_cancels_promptly_with_future_tasks
ERROR tests/worker/test_lifecycle.py::test_run_until_finished_exits_promptly_with_future_tasks
ERROR tests/worker/test_lifecycle.py::test_run_at_most_cancels_promptly_with_future_tasks
ERROR tests/worker/test_lifecycle.py::test_worker_drains_active_tasks_on_shutdown
ERROR tests/worker/test_scheduling.py::test_perpetual_tasks_are_scheduled_close_to_target_time
ERROR tests/worker/test_scheduling.py::test_worker_can_exit_from_perpetual_tasks_that_queue_further_tasks
ERROR tests/worker/test_scheduling.py::test_worker_can_exit_from_long_horizon_perpetual_tasks
ERROR tests/worker/test_scheduling.py::test_worker_timeout_exceeds_redelivery_timeout
ERROR tests/worker/test_scheduling.py::test_replacement_race_condition_stream_tasks
ERROR tests/worker/test_scheduling.py::test_replace_task_in_queue_before_stream
ERROR tests/worker/test_scheduling.py::test_rapid_replace_operations - Assert...
ERROR tests/worker/test_scheduling.py::test_wrongtype_error_with_legacy_known_task_key
ERROR tests/worker/test_scheduling.py::test_replace_task_with_legacy_known_key
= 381 failed, 229 passed, 76 skipped, 2 deselected, 359 errors in 251.12s (0:04:11) =
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.5ItPqM (%check)
    Bad exit status from /var/tmp/rpm-tmp.5ItPqM (%check)
Finish: rpmbuild python-pydocket-0.17.9-2.fc45.src.rpm
Finish: build phase for python-pydocket-0.17.9-2.fc45.src.rpm
INFO: chroot_scan: 1 files copied to /var/lib/copr-rpmbuild/results/chroot_scan
INFO: /var/lib/mock/fedora-rawhide-ppc64le-1773335321.895804/root/var/log/dnf5.log
INFO: chroot_scan: creating tarball /var/lib/copr-rpmbuild/results/chroot_scan.tar.gz
/bin/tar: Removing leading `/' from member names
ERROR: Exception(/var/lib/copr-rpmbuild/results/python-pydocket-0.17.9-2.fc45.src.rpm) Config(fedora-rawhide-ppc64le) 4 minutes 43 seconds
INFO: Results and/or logs in: /var/lib/copr-rpmbuild/results
INFO: Cleaning up build root ('cleanup_on_failure=True')
Start: clean chroot
INFO: unmounting tmpfs.
Finish: clean chroot
ERROR: Command failed:
 # /usr/bin/systemd-nspawn -q -M bde710d1552f45a3b346e4ebac757aed -D /var/lib/mock/fedora-rawhide-ppc64le-1773335321.895804/root -a -u mockbuild --capability=cap_ipc_lock --capability=cap_ipc_lock --bind=/tmp/mock-resolv.43i_q9b3:/etc/resolv.conf --bind=/dev/btrfs-control --bind=/dev/mapper/control --bind=/dev/fuse --bind=/dev/loop-control --bind=/dev/loop0 --bind=/dev/loop1 --bind=/dev/loop2 --bind=/dev/loop3 --bind=/dev/loop4 --bind=/dev/loop5 --bind=/dev/loop6 --bind=/dev/loop7 --bind=/dev/loop8 --bind=/dev/loop9 --bind=/dev/loop10 --bind=/dev/loop11 --console=pipe --setenv=TERM=vt100 --setenv=SHELL=/bin/bash --setenv=HOME=/builddir --setenv=HOSTNAME=mock --setenv=PATH=/usr/bin:/bin:/usr/sbin:/sbin '--setenv=PROMPT_COMMAND=printf "\033]0;\007"' '--setenv=PS1= \s-\v\$ ' --setenv=LANG=C.UTF-8 --resolv-conf=off bash --login -c '/usr/bin/rpmbuild -ba --noprep --target ppc64le /builddir/build/originals/python-pydocket.spec'
Copr build error: Build failed