Warning: Permanently added '2620:52:6:1161:dead:beef:cafe:c108' (ED25519) to the list of known hosts.
You can reproduce this build on your computer by running:

  sudo dnf install copr-rpmbuild
  /usr/bin/copr-rpmbuild --verbose --drop-resultdir \
      --task-url https://copr.fedorainfracloud.org/backend/get-build-task/10398108-fedora-rawhide-x86_64 \
      --chroot fedora-rawhide-x86_64

Version: 1.8
PID: 16394
Logging PID: 16396
Task:
{'allow_user_ssh': False,
 'appstream': False,
 'background': True,
 'build_id': 10398108,
 'buildroot_pkgs': [],
 'chroot': 'fedora-rawhide-x86_64',
 'enable_net': False,
 'fedora_review': False,
 'git_hash': '1f8405f5f7ed5d092d487202f76c110dfde601db',
 'git_repo': 'https://copr-dist-git.fedorainfracloud.org/git/qulogic/matplotlib311/python-seaborn',
 'isolation': 'default',
 'memory_reqs': 2048,
 'package_name': 'python-seaborn',
 'package_version': '0.13.2-18',
 'project_dirname': 'matplotlib311',
 'project_name': 'matplotlib311',
 'project_owner': 'qulogic',
 'repo_priority': None,
 'repos': [{'baseurl': 'https://download.copr.fedorainfracloud.org/results/qulogic/matplotlib311/fedora-rawhide-x86_64/',
            'id': 'copr_base',
            'name': 'Copr repository',
            'priority': None}],
 'sandbox': 'qulogic/matplotlib311--qulogic',
 'source_json': {},
 'source_type': None,
 'ssh_public_keys': None,
 'storage': 1,
 'submitter': 'qulogic',
 'tags': [],
 'task_id': '10398108-fedora-rawhide-x86_64',
 'timeout': 115200,
 'uses_devel_repo': False,
 'with_opts': [],
 'without_opts': []}

Running: git clone https://copr-dist-git.fedorainfracloud.org/git/qulogic/matplotlib311/python-seaborn /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn --depth 500 --no-single-branch --recursive
cmd: ['git', 'clone', 'https://copr-dist-git.fedorainfracloud.org/git/qulogic/matplotlib311/python-seaborn', '/var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn', '--depth', '500', '--no-single-branch', '--recursive']
cwd: .
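The clone above is shallow (`--depth 500`) but keeps all branches (`--no-single-branch`), and the next step pins the tree to the task's `git_hash`. A minimal offline sketch of that clone-and-pin pattern, using a throwaway local repository as a stand-in for the dist-git remote (the repository, commit, and identity below are hypothetical, not the real build's):

```shell
#!/bin/sh
# Sketch of the clone-and-pin step, against a local stand-in repository
# so it runs offline. The real build clones the dist-git URL from the
# task JSON and checks out the task's git_hash.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the dist-git repository (hypothetical content).
git init -q -b main origin-repo
git -C origin-repo -c user.email=copr@example.com -c user.name=copr \
    commit -q --allow-empty -m "automatic import of python-seaborn"
hash=$(git -C origin-repo rev-parse HEAD)

# Same shape as the build's clone: shallow, but keeping every branch.
git clone -q "file://$tmp/origin-repo" workdir --depth 500 --no-single-branch

# Pin to the exact commit; this leaves the tree in 'detached HEAD' state.
git -C workdir checkout -q "$hash" --
git -C workdir rev-parse HEAD
```

Checking out a raw hash rather than a branch is what produces the 'detached HEAD' advice seen later in the log.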
rc: 0
stdout:
stderr: Cloning into '/var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn'...

Running: git checkout 1f8405f5f7ed5d092d487202f76c110dfde601db --
cmd: ['git', 'checkout', '1f8405f5f7ed5d092d487202f76c110dfde601db', '--']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn
rc: 0
stdout:
stderr: Note: switching to '1f8405f5f7ed5d092d487202f76c110dfde601db'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at 1f8405f automatic import of python-seaborn

Running: dist-git-client sources
cmd: ['dist-git-client', 'sources']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn
rc: 0
stdout:
stderr: INFO: Reading stdout from command: git rev-parse --abbrev-ref HEAD
INFO: Reading stdout from command: git rev-parse HEAD
INFO: Reading sources specification file: sources
INFO: Downloading seaborn-0.13.2.tar.gz
INFO: Reading stdout from command: curl --help all
INFO: Calling: curl -H Pragma: -H 'Accept-Encoding: identity' -o seaborn-0.13.2.tar.gz --location --connect-timeout 60 --retry 3 --retry-delay 10 --remote-time --show-error --fail --retry-all-errors https://copr-dist-git.fedorainfracloud.org/repo/pkgs/qulogic/matplotlib311/python-seaborn/seaborn-0.13.2.tar.gz/md5/04d6f5e15656c62895169e0dec1162e6/seaborn-0.13.2.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 1423k  100 1423k    0     0  10.8M      0 --:--:-- --:--:-- --:--:-- 10.8M
INFO: Reading stdout from command: md5sum seaborn-0.13.2.tar.gz
tail:
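Note that the expected MD5 (`04d6f5e15656c62895169e0dec1162e6`) is embedded in the download URL, and `dist-git-client` recomputes `md5sum` on the downloaded tarball to verify it. A minimal sketch of that record-then-verify mechanism, with a stand-in file in place of the real tarball so it runs offline (the `sums` file below plays the role of the checksum record; it is an illustration, not the exact dist-git `sources` layout):

```shell
#!/bin/sh
# Sketch of download verification: record an expected checksum, then
# recompute and compare it, as dist-git-client does after curl finishes.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the downloaded seaborn-0.13.2.tar.gz (hypothetical bytes).
printf 'stand-in tarball contents\n' > seaborn-0.13.2.tar.gz

# Record the expected checksum (the role the sources specification plays)...
md5sum seaborn-0.13.2.tar.gz > sums

# ...then verify the file against the record; a mismatch exits non-zero.
md5sum -c sums
```

A corrupted or truncated download would make `md5sum -c` fail, aborting the build before the spec is ever handed to mock.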
/var/lib/copr-rpmbuild/main.log: file truncated

Running (timeout=115200): unbuffer mock --spec /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn/python-seaborn.spec --sources /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn --resultdir /var/lib/copr-rpmbuild/results --uniqueext 1777362911.442062 -r /var/lib/copr-rpmbuild/results/configs/child.cfg
INFO: mock.py version 6.7 starting (python version = 3.14.2, NVR = mock-6.7-1.fc43), args: /usr/libexec/mock/mock --spec /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn/python-seaborn.spec --sources /var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn --resultdir /var/lib/copr-rpmbuild/results --uniqueext 1777362911.442062 -r /var/lib/copr-rpmbuild/results/configs/child.cfg
Start(bootstrap): init plugins
INFO: tmpfs initialized
INFO: selinux enabled
INFO: chroot_scan: initialized
INFO: compress_logs: initialized
Finish(bootstrap): init plugins
Start: init plugins
INFO: tmpfs initialized
INFO: selinux enabled
INFO: chroot_scan: initialized
INFO: compress_logs: initialized
Finish: init plugins
INFO: Signal handler active
Start: run
INFO: Start(/var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn/python-seaborn.spec) Config(fedora-rawhide-x86_64)
Start: clean chroot
Finish: clean chroot
Mock Version: 6.7
INFO: Mock Version: 6.7
Start(bootstrap): chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-x86_64-bootstrap-1777362911.442062/root.
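The `-r .../child.cfg` argument points mock at a chroot configuration that Copr generates per build. Mock configuration files are Python fragments that populate `config_opts`; the sketch below is an illustrative fragment only, not the actual `child.cfg` — the keys (`include()`, `config_opts['root']`, `config_opts['rpmbuild_networking']`, `config_opts['dnf.conf']`) are standard mock options, and the values are inferred from the task JSON above:

```python
# Illustrative mock configuration fragment (assumed shape, not the real
# child.cfg). Mock configs are Python snippets evaluated into config_opts.
include('/etc/mock/fedora-rawhide-x86_64.cfg')   # base chroot definition

config_opts['root'] = 'fedora-rawhide-x86_64'
config_opts['rpmbuild_networking'] = False       # 'enable_net': False in the task

# Extra repository from the task's 'repos' list, appended to the chroot's
# package-manager configuration so the buildroot can see prior Copr results.
config_opts['dnf.conf'] += """
[copr_base]
name=Copr repository
baseurl=https://download.copr.fedorainfracloud.org/results/qulogic/matplotlib311/fedora-rawhide-x86_64/
"""
```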
INFO: calling preinit hooks
INFO: enabled root cache
INFO: enabled package manager cache
Start(bootstrap): cleaning package manager metadata
Finish(bootstrap): cleaning package manager metadata
INFO: Guessed host environment type: unknown
INFO: Using container image: registry.fedoraproject.org/fedora:rawhide
INFO: Pulling image: registry.fedoraproject.org/fedora:rawhide
INFO: Tagging container image as mock-bootstrap-06f8d343-1724-4ce7-9a41-3b7da39121d3
INFO: Checking that 748d4071be12267e057d10ce61110c533d26b98df7c02fe696d2a651c07d08ef image matches host's architecture
INFO: Copy content of container 748d4071be12267e057d10ce61110c533d26b98df7c02fe696d2a651c07d08ef to /var/lib/mock/fedora-rawhide-x86_64-bootstrap-1777362911.442062/root
INFO: mounting 748d4071be12267e057d10ce61110c533d26b98df7c02fe696d2a651c07d08ef with podman image mount
INFO: image 748d4071be12267e057d10ce61110c533d26b98df7c02fe696d2a651c07d08ef as /var/lib/containers/storage/overlay/7dc74afbf5ab5c2959bd3f1685475d8b88b140d3d1a68070b80c7ec04c312179/merged
INFO: umounting image 748d4071be12267e057d10ce61110c533d26b98df7c02fe696d2a651c07d08ef (/var/lib/containers/storage/overlay/7dc74afbf5ab5c2959bd3f1685475d8b88b140d3d1a68070b80c7ec04c312179/merged) with podman image umount
INFO: Removing image mock-bootstrap-06f8d343-1724-4ce7-9a41-3b7da39121d3
INFO: Package manager dnf5 detected and used (fallback)
INFO: Not updating bootstrap chroot, bootstrap_image_ready=True
Start(bootstrap): creating root cache
Finish(bootstrap): creating root cache
Finish(bootstrap): chroot init
Start: chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-x86_64-1777362911.442062/root.
INFO: calling preinit hooks INFO: enabled root cache INFO: enabled package manager cache Start: cleaning package manager metadata Finish: cleaning package manager metadata INFO: enabled HW Info plugin INFO: Package manager dnf5 detected and used (direct choice) INFO: Buildroot is handled by package management downloaded with a bootstrap image: rpm-6.0.1-5.fc45.x86_64 rpm-sequoia-1.10.2-1.fc45.x86_64 dnf5-5.4.2.0-1.fc45.x86_64 dnf5-plugins-5.4.2.0-1.fc45.x86_64 Start: installing minimal buildroot with dnf5 Updating and loading repositories: fedora 100% | 65.7 KiB/s | 37.0 KiB | 00m01s Copr repository 100% | 3.3 KiB/s | 1.5 KiB | 00m00s Repositories loaded. Package Arch Version Repository Size Installing group/module packages: bash x86_64 0:5.3.9-3.fc44 fedora 8.5 MiB bzip2 x86_64 0:1.0.8-23.fc44 fedora 95.0 KiB coreutils x86_64 0:9.10-3.fc45 fedora 5.6 MiB cpio x86_64 0:2.15-9.fc44 fedora 1.1 MiB diffutils x86_64 0:3.12-5.fc44 fedora 1.6 MiB fedora-release-common noarch 0:45-0.5 fedora 4.2 KiB findutils x86_64 1:4.10.0-7.fc44 fedora 1.9 MiB gawk x86_64 0:5.4.0-2.fc45 fedora 2.2 MiB glibc-minimal-langpack x86_64 0:2.43.9000-12.fc45 fedora 0.0 B grep x86_64 0:3.12-3.fc44 fedora 1.0 MiB gzip x86_64 0:1.14-2.fc44 fedora 401.6 KiB info x86_64 0:7.3-1.fc45 fedora 372.4 KiB patch x86_64 0:2.8-4.fc44 fedora 226.6 KiB redhat-rpm-config noarch 0:344-1.fc45 fedora 183.7 KiB rpm-build x86_64 0:6.0.1-5.fc45 fedora 294.6 KiB sed x86_64 0:4.10-1.fc45 fedora 939.4 KiB shadow-utils x86_64 2:4.19.3-2.fc45 fedora 4.0 MiB tar x86_64 2:1.35-8.fc44 fedora 3.0 MiB unzip x86_64 0:6.0-69.fc44 fedora 445.8 KiB util-linux x86_64 0:2.42-7.fc45 fedora 3.6 MiB which x86_64 0:2.23-4.fc44 fedora 83.4 KiB xz x86_64 1:5.8.3-1.fc45 fedora 1.4 MiB Installing dependencies: R-srpm-macros noarch 0:1.3.7-1.fc45 fedora 3.5 KiB add-determinism x86_64 0:0.7.3-2.fc45 fedora 2.2 MiB alternatives x86_64 0:1.33-5.fc44 fedora 62.1 KiB ansible-srpm-macros noarch 0:1-20.1.fc44 fedora 35.7 KiB audit-libs x86_64 
0:4.1.4-1.fc45 fedora 390.5 KiB binutils x86_64 0:2.46.50-7.fc45 fedora 28.2 MiB build-reproducibility-srpm-macros noarch 0:0.7.3-2.fc45 fedora 1.2 KiB bzip2-libs x86_64 0:1.0.8-23.fc44 fedora 80.5 KiB ca-certificates noarch 0:2025.2.80_v9.0.304-7.fc45 fedora 2.7 MiB cmake-srpm-macros noarch 0:4.3.0-1.fc45 fedora 524.0 B coreutils-common x86_64 0:9.10-3.fc45 fedora 10.7 MiB crypto-policies noarch 0:20251128-3.git19878fe.fc44 fedora 132.6 KiB curl x86_64 0:8.20.0~rc2-1.fc45 fedora 482.7 KiB cyrus-sasl-lib x86_64 0:2.1.28-35.fc44 fedora 2.3 MiB debugedit x86_64 0:5.3-2.fc45 fedora 220.8 KiB dwz x86_64 0:0.16-3.fc44 fedora 290.9 KiB ed x86_64 0:1.22.5-2.fc45 fedora 149.7 KiB efi-srpm-macros noarch 0:6-6.fc44 fedora 40.2 KiB elfutils x86_64 0:0.195-1.fc45 fedora 3.0 MiB elfutils-debuginfod-client x86_64 0:0.195-1.fc45 fedora 83.8 KiB elfutils-libelf x86_64 0:0.195-1.fc45 fedora 1.2 MiB elfutils-libs x86_64 0:0.195-1.fc45 fedora 715.3 KiB erlang-srpm-macros noarch 0:0.3.11-1.fc45 fedora 1.9 KiB fedora-gpg-keys noarch 0:45-0.1 fedora 133.4 KiB fedora-release noarch 0:45-0.5 fedora 0.0 B fedora-release-identity-basic noarch 0:45-0.5 fedora 664.0 B fedora-repos noarch 0:45-0.1 fedora 4.9 KiB fedora-repos-rawhide noarch 0:45-0.1 fedora 2.2 KiB file x86_64 0:5.47-2.fc45 fedora 101.2 KiB file-libs x86_64 0:5.47-2.fc45 fedora 12.2 MiB filesystem x86_64 0:3.18-56.fc45 fedora 112.0 B filesystem-srpm-macros noarch 0:3.18-56.fc45 fedora 38.2 KiB fonts-srpm-macros noarch 1:5.0.0-3.fc45 fedora 55.8 KiB forge-srpm-macros noarch 0:0.4.0-4.fc44 fedora 38.9 KiB fpc-srpm-macros noarch 0:1.3-16.fc44 fedora 144.0 B gap-srpm-macros noarch 0:2-2.fc44 fedora 2.1 KiB gdb-minimal x86_64 0:17.1-5.fc45 fedora 14.2 MiB gdbm-libs x86_64 1:1.23-11.fc44 fedora 129.6 KiB ghc-srpm-macros noarch 0:1.10-1.fc44 fedora 792.0 B glibc x86_64 0:2.43.9000-12.fc45 fedora 7.0 MiB glibc-common x86_64 0:2.43.9000-12.fc45 fedora 1.0 MiB glibc-gconv-extra x86_64 0:2.43.9000-12.fc45 fedora 7.7 MiB gmp x86_64 
1:6.3.0-5.fc44 fedora 815.2 KiB gnat-srpm-macros noarch 0:7-2.fc44 fedora 1.0 KiB gnulib-l10n noarch 0:20241231-2.fc44 fedora 655.0 KiB gnupg2 x86_64 0:2.4.9-7.fc45 fedora 6.5 MiB gnupg2-dirmngr x86_64 0:2.4.9-7.fc45 fedora 634.0 KiB gnupg2-gpg-agent x86_64 0:2.4.9-7.fc45 fedora 686.6 KiB gnupg2-gpgconf x86_64 0:2.4.9-7.fc45 fedora 249.7 KiB gnupg2-keyboxd x86_64 0:2.4.9-7.fc45 fedora 201.2 KiB gnupg2-verify x86_64 0:2.4.9-7.fc45 fedora 360.3 KiB gnutls x86_64 0:3.8.12-1.fc45 fedora 3.7 MiB go-srpm-macros noarch 0:3.8.0-2.fc44 fedora 61.9 KiB gpgverify noarch 0:2.2-4.fc44 fedora 8.7 KiB ima-evm-utils-libs x86_64 0:1.6.2-10.fc45 fedora 60.6 KiB jansson x86_64 0:2.14-4.fc44 fedora 88.9 KiB java-srpm-macros noarch 0:1-8.fc44 fedora 870.0 B json-c x86_64 0:0.18-8.fc44 fedora 82.6 KiB kernel-srpm-macros noarch 0:1.0-29.fc45 fedora 1.9 KiB keyutils-libs x86_64 0:1.6.3-7.fc44 fedora 54.2 KiB krb5-libs x86_64 0:1.22.2-6.fc45 fedora 2.4 MiB libacl x86_64 0:2.3.2-6.fc44 fedora 35.8 KiB libarchive x86_64 0:3.8.7-1.fc45 fedora 1.0 MiB libassuan x86_64 0:2.5.7-5.fc44 fedora 163.8 KiB libattr x86_64 0:2.5.2-8.fc44 fedora 24.3 KiB libblkid x86_64 0:2.42-7.fc45 fedora 282.2 KiB libbrotli x86_64 0:1.2.0-3.fc44 fedora 865.0 KiB libcap x86_64 0:2.78-1.fc45 fedora 212.1 KiB libcap-ng x86_64 0:0.9.3-1.fc45 fedora 68.8 KiB libcbor x86_64 0:0.13.0-2.fc44 fedora 79.5 KiB libcom_err x86_64 0:1.47.4-1.fc45 fedora 63.0 KiB libcurl x86_64 0:8.20.0~rc2-1.fc45 fedora 1.0 MiB libeconf x86_64 0:0.7.9-3.fc44 fedora 64.8 KiB libevent x86_64 0:2.1.12-17.fc44 fedora 978.7 KiB libfdisk x86_64 0:2.42-7.fc45 fedora 388.2 KiB libffi x86_64 0:3.5.2-2.fc44 fedora 87.7 KiB libfido2 x86_64 0:1.17.0-1.fc45 fedora 260.5 KiB libfsverity x86_64 0:1.7-1.fc45 fedora 28.4 KiB libgcc x86_64 0:16.0.1-0.11.fc45 fedora 270.6 KiB libgcrypt x86_64 0:1.12.2-1.fc45 fedora 1.7 MiB libgomp x86_64 0:16.0.1-0.11.fc45 fedora 581.4 KiB libgpg-error x86_64 0:1.60-1.fc45 fedora 948.7 KiB libidn2 x86_64 0:2.3.8-3.fc44 fedora 556.4 
KiB libksba x86_64 0:1.6.8-1.fc45 fedora 421.9 KiB liblastlog2 x86_64 0:2.42-7.fc45 fedora 41.6 KiB libmount x86_64 0:2.42-7.fc45 fedora 400.8 KiB libnghttp2 x86_64 0:1.69.0-1.fc45 fedora 166.1 KiB libnghttp3 x86_64 0:1.15.0-1.fc44 fedora 159.2 KiB libpkgconf x86_64 0:2.5.1-1.fc45 fedora 90.1 KiB libpsl x86_64 0:0.21.5-7.fc44 fedora 76.3 KiB libselinux x86_64 0:3.10-1.fc44 fedora 201.0 KiB libselinux-utils x86_64 0:3.10-1.fc44 fedora 305.7 KiB libsemanage x86_64 0:3.10-1.fc44 fedora 312.3 KiB libsepol x86_64 0:3.10-1.fc44 fedora 870.0 KiB libsmartcols x86_64 0:2.42-7.fc45 fedora 192.4 KiB libssh x86_64 0:0.12.0-1.fc45 fedora 719.1 KiB libssh-config noarch 0:0.12.0-1.fc45 fedora 277.0 B libstdc++ x86_64 0:16.0.1-0.11.fc45 fedora 3.1 MiB libtasn1 x86_64 0:4.21.0-1.fc45 fedora 180.6 KiB libtool-ltdl x86_64 0:2.5.4-10.fc44 fedora 70.0 KiB libunistring x86_64 0:1.1-11.fc44 fedora 1.7 MiB libusb1 x86_64 0:1.0.29-5.fc44 fedora 175.2 KiB libuuid x86_64 0:2.42-7.fc45 fedora 37.2 KiB libverto x86_64 0:0.3.2-12.fc44 fedora 25.3 KiB libxcrypt x86_64 0:4.5.2-3.fc44 fedora 293.2 KiB libxml2 x86_64 0:2.12.10-6.fc44 fedora 1.8 MiB libzstd x86_64 0:1.5.7-5.fc44 fedora 956.1 KiB linkdupes x86_64 0:0.7.3-2.fc45 fedora 780.9 KiB lua-libs x86_64 0:5.5.0-1.fc45 fedora 297.9 KiB lua-srpm-macros noarch 0:1-17.fc44 fedora 1.3 KiB lz4-libs x86_64 0:1.10.0-4.fc44 fedora 157.3 KiB mpfr x86_64 0:4.2.2-3.fc44 fedora 849.1 KiB ncurses-base noarch 0:6.6-1.fc44 fedora 329.7 KiB ncurses-libs x86_64 0:6.6-1.fc44 fedora 968.9 KiB nettle x86_64 0:3.10.1-3.fc44 fedora 794.3 KiB ngtcp2 x86_64 0:1.22.1-1.fc45 fedora 338.2 KiB ngtcp2-crypto-ossl x86_64 0:1.22.1-1.fc45 fedora 51.6 KiB npth x86_64 0:1.8-4.fc44 fedora 49.5 KiB ocaml-srpm-macros noarch 0:11-3.fc44 fedora 1.9 KiB openblas-srpm-macros noarch 0:2-21.fc44 fedora 112.0 B openldap x86_64 0:2.6.13-1.fc45 fedora 669.9 KiB openssl-libs x86_64 1:3.5.6-1.fc45 fedora 9.2 MiB p11-kit x86_64 0:0.26.2-1.fc45 fedora 2.6 MiB p11-kit-trust x86_64 
0:0.26.2-1.fc45 fedora 478.3 KiB package-notes-srpm-macros noarch 0:0.17-3.fc45 fedora 1.6 KiB pam-libs x86_64 0:1.7.2-1.fc44 fedora 130.5 KiB pcre2 x86_64 0:10.47-1.fc44.1 fedora 718.6 KiB pcre2-syntax noarch 0:10.47-1.fc44.1 fedora 281.9 KiB perl-srpm-macros noarch 0:1-61.fc44 fedora 861.0 B pkgconf x86_64 0:2.5.1-1.fc45 fedora 92.7 KiB pkgconf-m4 noarch 0:2.5.1-1.fc45 fedora 14.3 KiB pkgconf-pkg-config x86_64 0:2.5.1-1.fc45 fedora 989.0 B policycoreutils x86_64 0:3.10-3.fc45 fedora 884.5 KiB popt x86_64 0:1.19-10.fc44 fedora 132.6 KiB publicsuffix-list-dafsa noarch 0:20260116-1.fc44 fedora 70.4 KiB pyproject-srpm-macros noarch 0:1.20.0-1.fc45 fedora 1.9 KiB python-srpm-macros noarch 0:3.14-12.fc45 fedora 51.6 KiB qt5-srpm-macros noarch 0:5.15.18-2.fc44 fedora 500.0 B qt6-srpm-macros noarch 0:6.11.0-1.fc45 fedora 472.0 B readline x86_64 0:8.3-4.fc44 fedora 519.5 KiB redhat-systemd-presets noarch 0:102-1.fc45 fedora 1.0 KiB redhat-systemd-presets-common noarch 0:102-1.fc45 fedora 16.6 KiB rpm x86_64 0:6.0.1-5.fc45 fedora 3.1 MiB rpm-build-libs x86_64 0:6.0.1-5.fc45 fedora 276.3 KiB rpm-libs x86_64 0:6.0.1-5.fc45 fedora 961.2 KiB rpm-plugin-selinux x86_64 0:6.0.1-5.fc45 fedora 11.9 KiB rpm-sequoia x86_64 0:1.10.2-1.fc45 fedora 2.4 MiB rpm-sign-libs x86_64 0:6.0.1-5.fc45 fedora 39.6 KiB rust-srpm-macros noarch 0:28.4-3.fc44 fedora 5.5 KiB selinux-policy noarch 0:45.1-1.fc45 fedora 32.0 KiB selinux-policy-targeted noarch 0:45.1-1.fc45 fedora 18.6 MiB setup noarch 0:2.15.0-29.fc45 fedora 724.9 KiB sqlite-libs x86_64 0:3.52.0-1.fc45 fedora 1.6 MiB systemd-libs x86_64 0:260.1-2.fc45 fedora 2.5 MiB systemd-standalone-sysusers x86_64 0:260.1-2.fc45 fedora 841.2 KiB tpm2-tss x86_64 0:4.1.3-9.fc44 fedora 1.6 MiB tree-sitter-srpm-macros noarch 0:0.4.2-2.fc44 fedora 8.3 KiB util-linux-core x86_64 0:2.42-7.fc45 fedora 1.5 MiB xxhash-libs x86_64 0:0.8.3-4.fc44 fedora 94.0 KiB xz-libs x86_64 1:5.8.3-1.fc45 fedora 217.7 KiB zig-srpm-macros noarch 0:1-8.fc44 fedora 1.3 KiB zip 
x86_64 0:3.0-45.fc44 fedora 698.0 KiB zlib-ng-compat x86_64 0:2.3.3-5.fc45 fedora 165.6 KiB zstd x86_64 0:1.5.7-5.fc44 fedora 502.4 KiB Installing groups: Buildsystem building group Transaction Summary: Installing: 186 packages Total size of inbound packages is 70 MiB. Need to download 0 B. After this operation, 226 MiB extra will be used (install 226 MiB, remove 0 B). [ 1/186] tar-2:1.35-8.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 2/186] bzip2-0:1.0.8-23.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 3/186] redhat-rpm-config-0:344-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 4/186] rpm-build-0:6.0.1-5.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 5/186] unzip-0:6.0-69.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 6/186] cpio-0:2.15-9.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 7/186] which-0:2.23-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 8/186] bash-0:5.3.9-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 9/186] coreutils-0:9.10-3.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 10/186] grep-0:3.12-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 11/186] patch-0:2.8-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 12/186] sed-0:4.10-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 13/186] shadow-utils-2:4.19.3-2.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 14/186] diffutils-0:3.12-5.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 15/186] fedora-release-common-0:45-0. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 16/186] findutils-1:4.10.0-7.fc44.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 17/186] glibc-minimal-langpack-0:2.43 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 18/186] gzip-0:1.14-2.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 19/186] info-0:7.3-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 20/186] xz-1:5.8.3-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 21/186] util-linux-0:2.42-7.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 22/186] gawk-0:5.4.0-2.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 23/186] glibc-0:2.43.9000-12.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 24/186] libacl-0:2.3.2-6.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 25/186] libselinux-0:3.10-1.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 26/186] bzip2-libs-0:1.0.8-23.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 27/186] R-srpm-macros-0:1.3.7-1.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 28/186] ansible-srpm-macros-0:1-20.1. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 29/186] build-reproducibility-srpm-ma 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 30/186] cmake-srpm-macros-0:4.3.0-1.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 31/186] dwz-0:0.16-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 32/186] efi-srpm-macros-0:6-6.fc44.no 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 33/186] erlang-srpm-macros-0:0.3.11-1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 34/186] file-0:5.47-2.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 35/186] filesystem-srpm-macros-0:3.18 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 36/186] fonts-srpm-macros-1:5.0.0-3.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 37/186] forge-srpm-macros-0:0.4.0-4.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 38/186] fpc-srpm-macros-0:1.3-16.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 39/186] gap-srpm-macros-0:2-2.fc44.no 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 40/186] ghc-srpm-macros-0:1.10-1.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 41/186] gnat-srpm-macros-0:7-2.fc44.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 42/186] go-srpm-macros-0:3.8.0-2.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 43/186] java-srpm-macros-0:1-8.fc44.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 44/186] kernel-srpm-macros-0:1.0-29.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 45/186] lua-srpm-macros-0:1-17.fc44.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 46/186] ocaml-srpm-macros-0:11-3.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 47/186] openblas-srpm-macros-0:2-21.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 48/186] package-notes-srpm-macros-0:0 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 49/186] perl-srpm-macros-0:1-61.fc44. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 50/186] pyproject-srpm-macros-0:1.20. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 51/186] python-srpm-macros-0:3.14-12. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 52/186] qt5-srpm-macros-0:5.15.18-2.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 53/186] qt6-srpm-macros-0:6.11.0-1.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 54/186] rpm-0:6.0.1-5.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 55/186] rust-srpm-macros-0:28.4-3.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 56/186] tree-sitter-srpm-macros-0:0.4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 57/186] zig-srpm-macros-0:1-8.fc44.no 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 58/186] zip-0:3.0-45.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 59/186] debugedit-0:5.3-2.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 60/186] elfutils-0:0.195-1.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 61/186] elfutils-libelf-0:0.195-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 62/186] libarchive-0:3.8.7-1.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 63/186] libgcc-0:16.0.1-0.11.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 64/186] libstdc++-0:16.0.1-0.11.fc45. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 65/186] popt-0:1.19-10.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 66/186] readline-0:8.3-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 67/186] rpm-build-libs-0:6.0.1-5.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 68/186] rpm-libs-0:6.0.1-5.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 69/186] zstd-0:1.5.7-5.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 70/186] filesystem-0:3.18-56.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 71/186] ncurses-libs-0:6.6-1.fc44.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 72/186] coreutils-common-0:9.10-3.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 73/186] gmp-1:6.3.0-5.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 74/186] libattr-0:2.5.2-8.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 75/186] libcap-0:2.78-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 76/186] openssl-libs-1:3.5.6-1.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 77/186] systemd-libs-0:260.1-2.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 78/186] pcre2-0:10.47-1.fc44.1.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 79/186] ed-0:1.22.5-2.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 80/186] audit-libs-0:4.1.4-1.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 81/186] libeconf-0:0.7.9-3.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 82/186] libsemanage-0:3.10-1.fc44.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 83/186] libxcrypt-0:4.5.2-3.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 84/186] pam-libs-0:1.7.2-1.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 85/186] setup-0:2.15.0-29.fc45.noarch 100% | 0.0 B/s | 0.0 
B | 00m00s >>> Already downloaded [ 86/186] fedora-repos-0:45-0.1.noarch 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 87/186] redhat-systemd-presets-0:102- 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 88/186] glibc-common-0:2.43.9000-12.f 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 89/186] xz-libs-1:5.8.3-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 90/186] libblkid-0:2.42-7.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 91/186] libcap-ng-0:0.9.3-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 92/186] libfdisk-0:2.42-7.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 93/186] liblastlog2-0:2.42-7.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 94/186] libmount-0:2.42-7.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 95/186] libsmartcols-0:2.42-7.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 96/186] libuuid-0:2.42-7.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 97/186] util-linux-core-0:2.42-7.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 98/186] zlib-ng-compat-0:2.3.3-5.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [ 99/186] mpfr-0:4.2.2-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [100/186] glibc-gconv-extra-0:2.43.9000 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [101/186] libsepol-0:3.10-1.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [102/186] add-determinism-0:0.7.3-2.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [103/186] linkdupes-0:0.7.3-2.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [104/186] file-libs-0:5.47-2.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [105/186] curl-0:8.20.0~rc2-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [106/186] elfutils-libs-0:0.195-1.fc45. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [107/186] elfutils-debuginfod-client-0: 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [108/186] libzstd-0:1.5.7-5.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [109/186] libxml2-0:2.12.10-6.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [110/186] lz4-libs-0:1.10.0-4.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [111/186] libgomp-0:16.0.1-0.11.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [112/186] lua-libs-0:5.5.0-1.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [113/186] rpm-sign-libs-0:6.0.1-5.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [114/186] rpm-sequoia-0:1.10.2-1.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [115/186] sqlite-libs-0:3.52.0-1.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [116/186] ncurses-base-0:6.6-1.fc44.noa 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [117/186] gnulib-l10n-0:20241231-2.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [118/186] ca-certificates-0:2025.2.80_v 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [119/186] crypto-policies-0:20251128-3. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [120/186] pcre2-syntax-0:10.47-1.fc44.1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [121/186] fedora-gpg-keys-0:45-0.1.noar 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [122/186] fedora-repos-rawhide-0:45-0.1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [123/186] redhat-systemd-presets-common 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [124/186] json-c-0:0.18-8.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [125/186] gnupg2-0:2.4.9-7.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [126/186] ima-evm-utils-libs-0:1.6.2-10 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [127/186] libfsverity-0:1.7-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [128/186] gpgverify-0:2.2-4.fc44.noarch 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [129/186] gnupg2-dirmngr-0:2.4.9-7.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [130/186] gnupg2-gpg-agent-0:2.4.9-7.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [131/186] gnupg2-gpgconf-0:2.4.9-7.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [132/186] gnupg2-keyboxd-0:2.4.9-7.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [133/186] gnupg2-verify-0:2.4.9-7.fc45. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [134/186] libassuan-0:2.5.7-5.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [135/186] libgcrypt-0:1.12.2-1.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [136/186] libgpg-error-0:1.60-1.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [137/186] npth-0:1.8-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [138/186] tpm2-tss-0:4.1.3-9.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [139/186] gnutls-0:3.8.12-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [140/186] libksba-0:1.6.8-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [141/186] openldap-0:2.6.13-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [142/186] libusb1-0:1.0.29-5.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [143/186] libidn2-0:2.3.8-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [144/186] libtasn1-0:4.21.0-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [145/186] libunistring-0:1.1-11.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [146/186] nettle-0:3.10.1-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [147/186] p11-kit-0:0.26.2-1.fc45.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [148/186] cyrus-sasl-lib-0:2.1.28-35.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [149/186] libevent-0:2.1.12-17.fc44.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded [150/186] libtool-ltdl-0:2.5.4-10.fc44. 
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[151/186] libffi-0:3.5.2-2.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[152/186] gdbm-libs-1:1.23-11.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[153/186] binutils-0:2.46.50-7.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[154/186] alternatives-0:1.33-5.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[155/186] jansson-0:2.14-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[156/186] pkgconf-pkg-config-0:2.5.1-1. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[157/186] pkgconf-0:2.5.1-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[158/186] pkgconf-m4-0:2.5.1-1.fc45.noa 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[159/186] libpkgconf-0:2.5.1-1.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[160/186] p11-kit-trust-0:0.26.2-1.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[161/186] fedora-release-0:45-0.5.noarc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[162/186] systemd-standalone-sysusers-0 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[163/186] gdb-minimal-0:17.1-5.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[164/186] xxhash-libs-0:0.8.3-4.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[165/186] fedora-release-identity-basic 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[166/186] libcurl-0:8.20.0~rc2-1.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[167/186] krb5-libs-0:1.22.2-6.fc45.x86 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[168/186] libbrotli-0:1.2.0-3.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[169/186] libnghttp2-0:1.69.0-1.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[170/186] libnghttp3-0:1.15.0-1.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[171/186] libpsl-0:0.21.5-7.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[172/186] libssh-0:0.12.0-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[173/186] ngtcp2-0:1.22.1-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[174/186] ngtcp2-crypto-ossl-0:1.22.1-1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[175/186] keyutils-libs-0:1.6.3-7.fc44. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[176/186] libcom_err-0:1.47.4-1.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[177/186] libverto-0:0.3.2-12.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[178/186] publicsuffix-list-dafsa-0:202 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[179/186] libfido2-0:1.17.0-1.fc45.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[180/186] libssh-config-0:0.12.0-1.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[181/186] libcbor-0:0.13.0-2.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[182/186] selinux-policy-targeted-0:45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[183/186] policycoreutils-0:3.10-3.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[184/186] selinux-policy-0:45.1-1.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[185/186] libselinux-utils-0:3.10-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[186/186] rpm-plugin-selinux-0:6.0.1-5. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
--------------------------------------------------------------------------------
[186/186] Total 100% | 0.0 B/s | 0.0 B | 00m00s
Running transaction
Importing OpenPGP key 0xF577861E:
 UserID : "Fedora (45) "
 Fingerprint: 4F50A6114CD5C6976A7F1179655A4B02F577861E
 From : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-45-primary
The key was successfully imported.
Importing OpenPGP key 0xF577861E:
 UserID : "Fedora (45) "
 Fingerprint: 4F50A6114CD5C6976A7F1179655A4B02F577861E
 From : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-45-primary
The key was successfully imported.
Importing OpenPGP key 0x6D9F90A6:
 UserID : "Fedora (44) "
 Fingerprint: 36F612DCF27F7D1A48A835E4DBFCF71C6D9F90A6
 From : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-44-primary
The key was successfully imported.
Importing OpenPGP key 0x91211FCE:
 UserID : "Fedora (46) "
 Fingerprint: D924B10D3E810DABDD8B56B596E7E91491211FCE
 From : file:///usr/share/distribution-gpg-keys/fedora/RPM-GPG-KEY-fedora-46-primary
The key was successfully imported.
[ 1/188] Verify package files 100% | 641.0 B/s | 186.0 B | 00m00s
[ 2/188] Prepare transaction 100% | 2.2 KiB/s | 186.0 B | 00m00s
[ 3/188] Installing libgcc-0:16.0.1-0. 100% | 133.0 MiB/s | 272.3 KiB | 00m00s
[ 4/188] Installing redhat-systemd-pre 100% | 0.0 B/s | 1.5 KiB | 00m00s
[ 5/188] Installing redhat-systemd-pre 100% | 0.0 B/s | 17.8 KiB | 00m00s
[ 6/188] Installing libssh-config-0:0. 100% | 0.0 B/s | 816.0 B | 00m00s
[ 7/188] Installing publicsuffix-list- 100% | 0.0 B/s | 71.1 KiB | 00m00s
[ 8/188] Installing fedora-release-ide 100% | 0.0 B/s | 920.0 B | 00m00s
[ 9/188] Installing fedora-repos-rawhi 100% | 2.4 MiB/s | 2.4 KiB | 00m00s
[ 10/188] Installing fedora-gpg-keys-0: 100% | 29.6 MiB/s | 182.1 KiB | 00m00s
[ 11/188] Installing fedora-repos-0:45- 100% | 0.0 B/s | 5.7 KiB | 00m00s
[ 12/188] Installing fedora-release-com 100% | 7.3 MiB/s | 7.4 KiB | 00m00s
[ 13/188] Installing fedora-release-0:4 100% | 13.5 KiB/s | 124.0 B | 00m00s
>>> Running sysusers scriptlet: setup-0:2.15.0-29.fc45.noarch
>>> Finished sysusers scriptlet: setup-0:2.15.0-29.fc45.noarch
>>> Scriptlet output:
>>> Creating group 'adm' with GID 4.
>>> Creating group 'audio' with GID 63.
>>> Creating group 'cdrom' with GID 11.
>>> Creating group 'clock' with GID 103.
>>> Creating group 'dialout' with GID 18.
>>> Creating group 'disk' with GID 6.
>>> Creating group 'floppy' with GID 19.
>>> Creating group 'ftp' with GID 50.
>>> Creating group 'games' with GID 20.
>>> Creating group 'input' with GID 104.
>>> Creating group 'kmem' with GID 9.
>>> Creating group 'kvm' with GID 36.
>>> Creating group 'lock' with GID 54.
>>> Creating group 'lp' with GID 7.
>>> Creating group 'mail' with GID 12.
>>> Creating group 'man' with GID 15.
>>> Creating group 'mem' with GID 8.
>>> Creating group 'nobody' with GID 65534.
>>> Creating group 'render' with GID 105.
>>> Creating group 'root' with GID 0.
>>> Creating group 'sgx' with GID 106.
>>> Creating group 'sys' with GID 3.
>>> Creating group 'tape' with GID 33.
>>> Creating group 'tty' with GID 5.
>>> Creating group 'users' with GID 100.
>>> Creating group 'utmp' with GID 22.
>>> Creating group 'video' with GID 39.
>>> Creating group 'wheel' with GID 10.
>>> Creating user 'adm' (adm) with UID 3 and GID 4.
>>> Creating group 'bin' with GID 1.
>>> Creating user 'bin' (bin) with UID 1 and GID 1.
>>> Creating group 'daemon' with GID 2.
>>> Creating user 'daemon' (daemon) with UID 2 and GID 2.
>>> Creating user 'ftp' (FTP User) with UID 14 and GID 50.
>>> Creating user 'games' (games) with UID 12 and GID 100.
>>> Creating user 'halt' (halt) with UID 7 and GID 0.
>>> Creating user 'lp' (lp) with UID 4 and GID 7.
>>> Creating user 'mail' (mail) with UID 8 and GID 12.
>>> Creating user 'nobody' (Kernel Overflow User) with UID 65534 and GID 65534.
>>> Creating user 'operator' (operator) with UID 11 and GID 0.
>>> Creating user 'root' (Super User) with UID 0 and GID 0.
>>> Creating user 'shutdown' (shutdown) with UID 6 and GID 0.
>>> Creating user 'sync' (sync) with UID 5 and GID 0.
>>>
[ 14/188] Installing setup-0:2.15.0-29. 100% | 34.0 MiB/s | 730.6 KiB | 00m00s
>>> [RPM] /etc/hosts created as /etc/hosts.rpmnew
[ 15/188] Installing filesystem-0:3.18- 100% | 1.9 MiB/s | 289.4 KiB | 00m00s
[ 16/188] Installing pkgconf-m4-0:2.5.1 100% | 0.0 B/s | 14.7 KiB | 00m00s
[ 17/188] Installing pcre2-syntax-0:10. 100% | 277.7 MiB/s | 284.3 KiB | 00m00s
[ 18/188] Installing gnulib-l10n-0:2024 100% | 129.3 MiB/s | 661.9 KiB | 00m00s
[ 19/188] Installing coreutils-common-0 100% | 296.9 MiB/s | 10.7 MiB | 00m00s
[ 20/188] Installing ncurses-base-0:6.6 100% | 57.8 MiB/s | 355.3 KiB | 00m00s
[ 21/188] Installing bash-0:5.3.9-3.fc4 100% | 212.0 MiB/s | 8.5 MiB | 00m00s
[ 22/188] Installing glibc-common-0:2.4 100% | 46.8 MiB/s | 1.0 MiB | 00m00s
[ 23/188] Installing glibc-gconv-extra- 100% | 215.6 MiB/s | 7.8 MiB | 00m00s
[ 24/188] Installing glibc-0:2.43.9000- 100% | 142.6 MiB/s | 7.0 MiB | 00m00s
[ 25/188] Installing ncurses-libs-0:6.6 100% | 190.5 MiB/s | 975.4 KiB | 00m00s
[ 26/188] Installing glibc-minimal-lang 100% | 0.0 B/s | 124.0 B | 00m00s
[ 27/188] Installing zlib-ng-compat-0:2 100% | 162.5 MiB/s | 166.4 KiB | 00m00s
[ 28/188] Installing bzip2-libs-0:1.0.8 100% | 79.7 MiB/s | 81.6 KiB | 00m00s
[ 29/188] Installing libgpg-error-0:1.6 100% | 44.4 MiB/s | 954.6 KiB | 00m00s
[ 30/188] Installing libstdc++-0:16.0.1 100% | 309.3 MiB/s | 3.1 MiB | 00m00s
[ 31/188] Installing libassuan-0:2.5.7- 100% | 161.8 MiB/s | 165.7 KiB | 00m00s
[ 32/188] Installing libgcrypt-0:1.12.2 100% | 331.7 MiB/s | 1.7 MiB | 00m00s
[ 33/188] Installing readline-0:8.3-4.f 100% | 254.7 MiB/s | 521.6 KiB | 00m00s
[ 34/188] Installing gmp-1:6.3.0-5.fc44 100% | 266.1 MiB/s | 817.5 KiB | 00m00s
[ 35/188] Installing systemd-libs-0:260 100% | 283.0 MiB/s | 2.5 MiB | 00m00s
[ 36/188] Installing xz-libs-1:5.8.3-1. 100% | 213.7 MiB/s | 218.8 KiB | 00m00s
[ 37/188] Installing libuuid-0:2.42-7.f 100% | 37.3 MiB/s | 38.2 KiB | 00m00s
[ 38/188] Installing libzstd-0:1.5.7-5. 100% | 311.6 MiB/s | 957.4 KiB | 00m00s
[ 39/188] Installing elfutils-libelf-0: 100% | 290.5 MiB/s | 1.2 MiB | 00m00s
[ 40/188] Installing popt-0:1.19-10.fc4 100% | 45.3 MiB/s | 139.3 KiB | 00m00s
[ 41/188] Installing npth-0:1.8-4.fc44. 100% | 49.4 MiB/s | 50.6 KiB | 00m00s
[ 42/188] Installing elfutils-libs-0:0. 100% | 233.4 MiB/s | 717.1 KiB | 00m00s
[ 43/188] Installing libblkid-0:2.42-7. 100% | 276.6 MiB/s | 283.2 KiB | 00m00s
[ 44/188] Installing libattr-0:2.5.2-8. 100% | 0.0 B/s | 25.2 KiB | 00m00s
[ 45/188] Installing libacl-0:2.3.2-6.f 100% | 0.0 B/s | 36.6 KiB | 00m00s
[ 46/188] Installing libsepol-0:3.10-1. 100% | 283.5 MiB/s | 871.0 KiB | 00m00s
[ 47/188] Installing sqlite-libs-0:3.52 100% | 267.1 MiB/s | 1.6 MiB | 00m00s
[ 48/188] Installing gnupg2-gpgconf-0:2 100% | 13.7 MiB/s | 251.8 KiB | 00m00s
[ 49/188] Installing pcre2-0:10.47-1.fc 100% | 234.4 MiB/s | 720.0 KiB | 00m00s
[ 50/188] Installing libselinux-0:3.10- 100% | 197.5 MiB/s | 202.3 KiB | 00m00s
[ 51/188] Installing grep-0:3.12-3.fc44 100% | 45.6 MiB/s | 1.0 MiB | 00m00s
[ 52/188] Installing sed-0:4.10-1.fc45. 100% | 42.1 MiB/s | 947.8 KiB | 00m00s
[ 53/188] Installing findutils-1:4.10.0 100% | 77.9 MiB/s | 1.9 MiB | 00m00s
[ 54/188] Installing libxcrypt-0:4.5.2- 100% | 144.5 MiB/s | 295.9 KiB | 00m00s
[ 55/188] Installing libtasn1-0:4.21.0- 100% | 178.2 MiB/s | 182.4 KiB | 00m00s
[ 56/188] Installing libunistring-0:1.1 100% | 289.1 MiB/s | 1.7 MiB | 00m00s
[ 57/188] Installing libidn2-0:2.3.8-3. 100% | 49.9 MiB/s | 562.6 KiB | 00m00s
[ 58/188] Installing crypto-policies-0: 100% | 22.0 MiB/s | 157.7 KiB | 00m00s
[ 59/188] Installing xz-1:5.8.3-1.fc45. 100% | 54.0 MiB/s | 1.4 MiB | 00m00s
[ 60/188] Installing libmount-0:2.42-7. 100% | 196.2 MiB/s | 401.9 KiB | 00m00s
[ 61/188] Installing gnupg2-verify-0:2. 100% | 19.6 MiB/s | 361.7 KiB | 00m00s
[ 62/188] Installing dwz-0:0.16-3.fc44. 100% | 15.9 MiB/s | 292.3 KiB | 00m00s
[ 63/188] Installing mpfr-0:4.2.2-3.fc4 100% | 207.7 MiB/s | 850.8 KiB | 00m00s
[ 64/188] Installing gawk-0:5.4.0-2.fc4 100% | 89.0 MiB/s | 2.2 MiB | 00m00s
[ 65/188] Installing libksba-0:1.6.8-1. 100% | 207.2 MiB/s | 424.4 KiB | 00m00s
[ 66/188] Installing unzip-0:6.0-69.fc4 100% | 24.4 MiB/s | 449.3 KiB | 00m00s
[ 67/188] Installing file-libs-0:5.47-2 100% | 609.5 MiB/s | 12.2 MiB | 00m00s
[ 68/188] Installing file-0:5.47-2.fc45 100% | 5.9 MiB/s | 102.6 KiB | 00m00s
[ 69/188] Installing diffutils-0:3.12-5 100% | 68.4 MiB/s | 1.6 MiB | 00m00s
[ 70/188] Installing libeconf-0:0.7.9-3 100% | 64.9 MiB/s | 66.4 KiB | 00m00s
[ 71/188] Installing libcap-ng-0:0.9.3- 100% | 69.0 MiB/s | 70.6 KiB | 00m00s
[ 72/188] Installing audit-libs-0:4.1.4 100% | 191.9 MiB/s | 393.0 KiB | 00m00s
[ 73/188] Installing pam-libs-0:1.7.2-1 100% | 129.8 MiB/s | 132.9 KiB | 00m00s
[ 74/188] Installing libcap-0:2.78-1.fc 100% | 11.2 MiB/s | 217.2 KiB | 00m00s
[ 75/188] Installing libsemanage-0:3.10 100% | 153.4 MiB/s | 314.1 KiB | 00m00s
[ 76/188] Installing libsmartcols-0:2.4 100% | 188.9 MiB/s | 193.4 KiB | 00m00s
[ 77/188] Installing lua-libs-0:5.5.0-1 100% | 146.2 MiB/s | 299.4 KiB | 00m00s
[ 78/188] Installing json-c-0:0.18-8.fc 100% | 81.9 MiB/s | 83.9 KiB | 00m00s
[ 79/188] Installing libffi-0:3.5.2-2.f 100% | 87.0 MiB/s | 89.1 KiB | 00m00s
[ 80/188] Installing p11-kit-0:0.26.2-1 100% | 92.7 MiB/s | 2.6 MiB | 00m00s
[ 81/188] Installing alternatives-0:1.3 100% | 3.5 MiB/s | 63.6 KiB | 00m00s
[ 82/188] Installing p11-kit-trust-0:0. 100% | 16.2 MiB/s | 480.0 KiB | 00m00s
[ 83/188] Installing ngtcp2-0:1.22.1-1. 100% | 165.9 MiB/s | 339.7 KiB | 00m00s
[ 84/188] Installing openssl-libs-1:3.5 100% | 340.3 MiB/s | 9.2 MiB | 00m00s
[ 85/188] Installing coreutils-0:9.10-3 100% | 121.1 MiB/s | 5.7 MiB | 00m00s
[ 86/188] Installing ca-certificates-0: 100% | 1.5 MiB/s | 2.5 MiB | 00m02s
[ 87/188] Installing gzip-0:1.14-2.fc44 100% | 20.9 MiB/s | 407.1 KiB | 00m00s
[ 88/188] Installing rpm-sequoia-0:1.10 100% | 295.4 MiB/s | 2.4 MiB | 00m00s
[ 89/188] Installing libfsverity-0:1.7- 100% | 28.7 MiB/s | 29.4 KiB | 00m00s
[ 90/188] Installing libevent-0:2.1.12- 100% | 239.9 MiB/s | 982.4 KiB | 00m00s
[ 91/188] Installing systemd-standalone 100% | 43.3 MiB/s | 841.8 KiB | 00m00s
[ 92/188] Installing rpm-libs-0:6.0.1-5 100% | 235.0 MiB/s | 962.8 KiB | 00m00s
[ 93/188] Installing ngtcp2-crypto-ossl 100% | 51.3 MiB/s | 52.5 KiB | 00m00s
[ 94/188] Installing util-linux-core-0: 100% | 61.2 MiB/s | 1.5 MiB | 00m00s
[ 95/188] Installing liblastlog2-0:2.42 100% | 6.1 MiB/s | 43.6 KiB | 00m00s
[ 96/188] Installing zip-0:3.0-45.fc44. 100% | 34.3 MiB/s | 701.9 KiB | 00m00s
[ 97/188] Installing gnupg2-keyboxd-0:2 100% | 24.7 MiB/s | 202.5 KiB | 00m00s
[ 98/188] Installing libpsl-0:0.21.5-7. 100% | 75.6 MiB/s | 77.4 KiB | 00m00s
[ 99/188] Installing tar-2:1.35-8.fc44. 100% | 106.4 MiB/s | 3.0 MiB | 00m00s
[100/188] Installing linkdupes-0:0.7.3- 100% | 38.2 MiB/s | 782.3 KiB | 00m00s
[101/188] Installing libselinux-utils-0 100% | 16.5 MiB/s | 320.1 KiB | 00m00s
[102/188] Installing libfdisk-0:2.42-7. 100% | 126.7 MiB/s | 389.3 KiB | 00m00s
[103/188] Installing util-linux-0:2.42- 100% | 77.3 MiB/s | 3.7 MiB | 00m00s
[104/188] Installing policycoreutils-0: 100% | 25.5 MiB/s | 912.9 KiB | 00m00s
[105/188] Installing selinux-policy-0:4 100% | 1.3 MiB/s | 34.5 KiB | 00m00s
[106/188] Installing selinux-policy-tar 100% | 139.2 MiB/s | 14.9 MiB | 00m00s
[107/188] Installing zstd-0:1.5.7-5.fc4 100% | 23.5 MiB/s | 506.0 KiB | 00m00s
[108/188] Installing libxml2-0:2.12.10- 100% | 77.0 MiB/s | 1.8 MiB | 00m00s
[109/188] Installing libusb1-0:1.0.29-5 100% | 15.7 MiB/s | 176.9 KiB | 00m00s
>>> Running sysusers scriptlet: tpm2-tss-0:4.1.3-9.fc44.x86_64
>>> Finished sysusers scriptlet: tpm2-tss-0:4.1.3-9.fc44.x86_64
>>> Scriptlet output:
>>> Creating group 'tss' with GID 59.
>>> Creating user 'tss' (Account used for TPM access) with UID 59 and GID 59.
>>>
[110/188] Installing tpm2-tss-0:4.1.3-9 100% | 233.3 MiB/s | 1.6 MiB | 00m00s
[111/188] Installing ima-evm-utils-libs 100% | 60.5 MiB/s | 61.9 KiB | 00m00s
[112/188] Installing gnupg2-gpg-agent-0 100% | 25.0 MiB/s | 690.6 KiB | 00m00s
[113/188] Installing nettle-0:3.10.1-3. 100% | 259.6 MiB/s | 797.4 KiB | 00m00s
[114/188] Installing gnutls-0:3.8.12-1. 100% | 311.3 MiB/s | 3.7 MiB | 00m00s
[115/188] Installing bzip2-0:1.0.8-23.f 100% | 5.4 MiB/s | 99.5 KiB | 00m00s
[116/188] Installing add-determinism-0: 100% | 94.9 MiB/s | 2.2 MiB | 00m00s
[117/188] Installing build-reproducibil 100% | 0.0 B/s | 1.5 KiB | 00m00s
[118/188] Installing cpio-0:2.15-9.fc44 100% | 52.5 MiB/s | 1.1 MiB | 00m00s
[119/188] Installing ed-0:1.22.5-2.fc45 100% | 8.2 MiB/s | 152.0 KiB | 00m00s
[120/188] Installing patch-0:2.8-4.fc44 100% | 12.4 MiB/s | 228.1 KiB | 00m00s
[121/188] Installing lz4-libs-0:1.10.0- 100% | 154.6 MiB/s | 158.4 KiB | 00m00s
[122/188] Installing libarchive-0:3.8.7 100% | 244.5 MiB/s | 1.0 MiB | 00m00s
[123/188] Installing libgomp-0:16.0.1-0 100% | 284.6 MiB/s | 582.8 KiB | 00m00s
[124/188] Installing libtool-ltdl-0:2.5 100% | 69.5 MiB/s | 71.1 KiB | 00m00s
[125/188] Installing gdbm-libs-1:1.23-1 100% | 128.3 MiB/s | 131.3 KiB | 00m00s
[126/188] Installing cyrus-sasl-lib-0:2 100% | 96.3 MiB/s | 2.3 MiB | 00m00s
[127/188] Installing openldap-0:2.6.13- 100% | 164.5 MiB/s | 673.7 KiB | 00m00s
[128/188] Installing gnupg2-dirmngr-0:2 100% | 23.9 MiB/s | 636.7 KiB | 00m00s
[129/188] Installing gnupg2-0:2.4.9-7.f 100% | 171.2 MiB/s | 6.5 MiB | 00m00s
[130/188] Installing rpm-sign-libs-0:6. 100% | 39.4 MiB/s | 40.3 KiB | 00m00s
[131/188] Installing rpm-build-libs-0:6 100% | 270.7 MiB/s | 277.2 KiB | 00m00s
[132/188] Installing gpgverify-0:2.2-4. 100% | 0.0 B/s | 9.4 KiB | 00m00s
[133/188] Installing jansson-0:2.14-4.f 100% | 88.2 MiB/s | 90.3 KiB | 00m00s
[134/188] Installing libpkgconf-0:2.5.1 100% | 89.1 MiB/s | 91.3 KiB | 00m00s
[135/188] Installing pkgconf-0:2.5.1-1. 100% | 5.2 MiB/s | 95.2 KiB | 00m00s
[136/188] Installing pkgconf-pkg-config 100% | 104.3 KiB/s | 1.8 KiB | 00m00s
[137/188] Installing xxhash-libs-0:0.8. 100% | 93.2 MiB/s | 95.4 KiB | 00m00s
[138/188] Installing libbrotli-0:1.2.0- 100% | 282.3 MiB/s | 867.3 KiB | 00m00s
[139/188] Installing libnghttp2-0:1.69. 100% | 163.3 MiB/s | 167.3 KiB | 00m00s
[140/188] Installing libnghttp3-0:1.15. 100% | 156.8 MiB/s | 160.6 KiB | 00m00s
[141/188] Installing keyutils-libs-0:1. 100% | 54.3 MiB/s | 55.6 KiB | 00m00s
[142/188] Installing libcom_err-0:1.47. 100% | 62.5 MiB/s | 64.0 KiB | 00m00s
[143/188] Installing libverto-0:0.3.2-1 100% | 26.4 MiB/s | 27.1 KiB | 00m00s
[144/188] Installing krb5-libs-0:1.22.2 100% | 266.1 MiB/s | 2.4 MiB | 00m00s
[145/188] Installing libcbor-0:0.13.0-2 100% | 79.0 MiB/s | 80.9 KiB | 00m00s
[146/188] Installing libfido2-0:1.17.0- 100% | 127.9 MiB/s | 262.0 KiB | 00m00s
[147/188] Installing libssh-0:0.12.0-1. 100% | 176.1 MiB/s | 721.2 KiB | 00m00s
[148/188] Installing libcurl-0:8.20.0~r 100% | 255.5 MiB/s | 1.0 MiB | 00m00s
[149/188] Installing elfutils-debuginfo 100% | 4.7 MiB/s | 86.1 KiB | 00m00s
[150/188] Installing elfutils-0:0.195-1 100% | 110.2 MiB/s | 3.0 MiB | 00m00s
[151/188] Installing binutils-0:2.46.50 100% | 291.5 MiB/s | 28.3 MiB | 00m00s
[152/188] Installing gdb-minimal-0:17.1 100% | 249.0 MiB/s | 14.2 MiB | 00m00s
[153/188] Installing debugedit-0:5.3-2. 100% | 12.2 MiB/s | 224.1 KiB | 00m00s
[154/188] Installing curl-0:8.20.0~rc2- 100% | 17.6 MiB/s | 485.3 KiB | 00m00s
[155/188] Installing rpm-0:6.0.1-5.fc45 100% | 59.3 MiB/s | 2.6 MiB | 00m00s
[156/188] Installing cmake-srpm-macros- 100% | 0.0 B/s | 804.0 B | 00m00s
[157/188] Installing efi-srpm-macros-0: 100% | 40.2 MiB/s | 41.2 KiB | 00m00s
[158/188] Installing java-srpm-macros-0 100% | 0.0 B/s | 1.1 KiB | 00m00s
[159/188] Installing lua-srpm-macros-0: 100% | 0.0 B/s | 1.9 KiB | 00m00s
[160/188] Installing tree-sitter-srpm-m 100% | 1.5 MiB/s | 9.3 KiB | 00m00s
[161/188] Installing zig-srpm-macros-0: 100% | 0.0 B/s | 1.9 KiB | 00m00s
[162/188] Installing filesystem-srpm-ma 100% | 0.0 B/s | 38.9 KiB | 00m00s
[163/188] Installing rust-srpm-macros-0 100% | 0.0 B/s | 6.4 KiB | 00m00s
[164/188] Installing qt6-srpm-macros-0: 100% | 0.0 B/s | 748.0 B | 00m00s
[165/188] Installing qt5-srpm-macros-0: 100% | 0.0 B/s | 776.0 B | 00m00s
[166/188] Installing perl-srpm-macros-0 100% | 0.0 B/s | 1.1 KiB | 00m00s
[167/188] Installing package-notes-srpm 100% | 0.0 B/s | 2.1 KiB | 00m00s
[168/188] Installing openblas-srpm-macr 100% | 0.0 B/s | 392.0 B | 00m00s
[169/188] Installing ocaml-srpm-macros- 100% | 0.0 B/s | 2.1 KiB | 00m00s
[170/188] Installing kernel-srpm-macros 100% | 0.0 B/s | 2.3 KiB | 00m00s
[171/188] Installing gnat-srpm-macros-0 100% | 0.0 B/s | 1.3 KiB | 00m00s
[172/188] Installing ghc-srpm-macros-0: 100% | 0.0 B/s | 1.0 KiB | 00m00s
[173/188] Installing gap-srpm-macros-0: 100% | 0.0 B/s | 2.7 KiB | 00m00s
[174/188] Installing fpc-srpm-macros-0: 100% | 0.0 B/s | 420.0 B | 00m00s
[175/188] Installing ansible-srpm-macro 100% | 0.0 B/s | 36.2 KiB | 00m00s
[176/188] Installing rpm-build-0:6.0.1- 100% | 14.1 MiB/s | 303.7 KiB | 00m00s
[177/188] Installing erlang-srpm-macros 100% | 0.0 B/s | 2.5 KiB | 00m00s
[178/188] Installing pyproject-srpm-mac 100% | 2.4 MiB/s | 2.5 KiB | 00m00s
[179/188] Installing redhat-rpm-config- 100% | 92.7 MiB/s | 189.9 KiB | 00m00s
[180/188] Installing forge-srpm-macros- 100% | 39.3 MiB/s | 40.3 KiB | 00m00s
[181/188] Installing fonts-srpm-macros- 100% | 55.7 MiB/s | 57.0 KiB | 00m00s
[182/188] Installing go-srpm-macros-0:3 100% | 61.6 MiB/s | 63.0 KiB | 00m00s
[183/188] Installing R-srpm-macros-0:1. 100% | 0.0 B/s | 4.4 KiB | 00m00s
[184/188] Installing python-srpm-macros 100% | 51.7 MiB/s | 52.9 KiB | 00m00s
[185/188] Installing rpm-plugin-selinux 100% | 0.0 B/s | 12.9 KiB | 00m00s
[186/188] Installing which-0:2.23-4.fc4 100% | 4.4 MiB/s | 85.6 KiB | 00m00s
[187/188] Installing shadow-utils-2:4.1 100% | 100.7 MiB/s | 4.0 MiB | 00m00s
[188/188] Installing info-0:7.3-1.fc45. 100% | 45.3 KiB/s | 372.8 KiB | 00m08s
Complete!
Finish: installing minimal buildroot with dnf5
Start: creating root cache
Finish: creating root cache
Finish: chroot init
INFO: Installed packages:
INFO: R-srpm-macros-1.3.7-1.fc45.noarch
  add-determinism-0.7.3-2.fc45.x86_64
  alternatives-1.33-5.fc44.x86_64
  ansible-srpm-macros-1-20.1.fc44.noarch
  audit-libs-4.1.4-1.fc45.x86_64
  bash-5.3.9-3.fc44.x86_64
  binutils-2.46.50-7.fc45.x86_64
  build-reproducibility-srpm-macros-0.7.3-2.fc45.noarch
  bzip2-1.0.8-23.fc44.x86_64
  bzip2-libs-1.0.8-23.fc44.x86_64
  ca-certificates-2025.2.80_v9.0.304-7.fc45.noarch
  cmake-srpm-macros-4.3.0-1.fc45.noarch
  coreutils-9.10-3.fc45.x86_64
  coreutils-common-9.10-3.fc45.x86_64
  cpio-2.15-9.fc44.x86_64
  crypto-policies-20251128-3.git19878fe.fc44.noarch
  curl-8.20.0~rc2-1.fc45.x86_64
  cyrus-sasl-lib-2.1.28-35.fc44.x86_64
  debugedit-5.3-2.fc45.x86_64
  diffutils-3.12-5.fc44.x86_64
  dwz-0.16-3.fc44.x86_64
  ed-1.22.5-2.fc45.x86_64
  efi-srpm-macros-6-6.fc44.noarch
  elfutils-0.195-1.fc45.x86_64
  elfutils-debuginfod-client-0.195-1.fc45.x86_64
  elfutils-libelf-0.195-1.fc45.x86_64
  elfutils-libs-0.195-1.fc45.x86_64
  erlang-srpm-macros-0.3.11-1.fc45.noarch
  fedora-gpg-keys-45-0.1.noarch
  fedora-release-45-0.5.noarch
  fedora-release-common-45-0.5.noarch
  fedora-release-identity-basic-45-0.5.noarch
  fedora-repos-45-0.1.noarch
  fedora-repos-rawhide-45-0.1.noarch
  file-5.47-2.fc45.x86_64
  file-libs-5.47-2.fc45.x86_64
  filesystem-3.18-56.fc45.x86_64
  filesystem-srpm-macros-3.18-56.fc45.noarch
  findutils-4.10.0-7.fc44.x86_64
  fonts-srpm-macros-5.0.0-3.fc45.noarch
  forge-srpm-macros-0.4.0-4.fc44.noarch
  fpc-srpm-macros-1.3-16.fc44.noarch
  gap-srpm-macros-2-2.fc44.noarch
  gawk-5.4.0-2.fc45.x86_64
  gdb-minimal-17.1-5.fc45.x86_64
  gdbm-libs-1.23-11.fc44.x86_64
  ghc-srpm-macros-1.10-1.fc44.noarch
  glibc-2.43.9000-12.fc45.x86_64
  glibc-common-2.43.9000-12.fc45.x86_64
  glibc-gconv-extra-2.43.9000-12.fc45.x86_64
  glibc-minimal-langpack-2.43.9000-12.fc45.x86_64
  gmp-6.3.0-5.fc44.x86_64
  gnat-srpm-macros-7-2.fc44.noarch
  gnulib-l10n-20241231-2.fc44.noarch
  gnupg2-2.4.9-7.fc45.x86_64
  gnupg2-dirmngr-2.4.9-7.fc45.x86_64
  gnupg2-gpg-agent-2.4.9-7.fc45.x86_64
  gnupg2-gpgconf-2.4.9-7.fc45.x86_64
  gnupg2-keyboxd-2.4.9-7.fc45.x86_64
  gnupg2-verify-2.4.9-7.fc45.x86_64
  gnutls-3.8.12-1.fc45.x86_64
  go-srpm-macros-3.8.0-2.fc44.noarch
  gpg-pubkey-36f612dcf27f7d1a48a835e4dbfcf71c6d9f90a6-6786af3b
  gpg-pubkey-4f50a6114cd5c6976a7f1179655a4b02f577861e-6888bc98
  gpg-pubkey-d924b10d3e810dabdd8b56b596e7e91491211fce-697c9899
  gpgverify-2.2-4.fc44.noarch
  grep-3.12-3.fc44.x86_64
  gzip-1.14-2.fc44.x86_64
  ima-evm-utils-libs-1.6.2-10.fc45.x86_64
  info-7.3-1.fc45.x86_64
  jansson-2.14-4.fc44.x86_64
  java-srpm-macros-1-8.fc44.noarch
  json-c-0.18-8.fc44.x86_64
  kernel-srpm-macros-1.0-29.fc45.noarch
  keyutils-libs-1.6.3-7.fc44.x86_64
  krb5-libs-1.22.2-6.fc45.x86_64
  libacl-2.3.2-6.fc44.x86_64
  libarchive-3.8.7-1.fc45.x86_64
  libassuan-2.5.7-5.fc44.x86_64
  libattr-2.5.2-8.fc44.x86_64
  libblkid-2.42-7.fc45.x86_64
  libbrotli-1.2.0-3.fc44.x86_64
  libcap-2.78-1.fc45.x86_64
  libcap-ng-0.9.3-1.fc45.x86_64
  libcbor-0.13.0-2.fc44.x86_64
  libcom_err-1.47.4-1.fc45.x86_64
  libcurl-8.20.0~rc2-1.fc45.x86_64
  libeconf-0.7.9-3.fc44.x86_64
  libevent-2.1.12-17.fc44.x86_64
  libfdisk-2.42-7.fc45.x86_64
  libffi-3.5.2-2.fc44.x86_64
  libfido2-1.17.0-1.fc45.x86_64
  libfsverity-1.7-1.fc45.x86_64
  libgcc-16.0.1-0.11.fc45.x86_64
  libgcrypt-1.12.2-1.fc45.x86_64
  libgomp-16.0.1-0.11.fc45.x86_64
  libgpg-error-1.60-1.fc45.x86_64
  libidn2-2.3.8-3.fc44.x86_64
  libksba-1.6.8-1.fc45.x86_64
  liblastlog2-2.42-7.fc45.x86_64
  libmount-2.42-7.fc45.x86_64
  libnghttp2-1.69.0-1.fc45.x86_64
  libnghttp3-1.15.0-1.fc44.x86_64
  libpkgconf-2.5.1-1.fc45.x86_64
  libpsl-0.21.5-7.fc44.x86_64
  libselinux-3.10-1.fc44.x86_64
  libselinux-utils-3.10-1.fc44.x86_64
  libsemanage-3.10-1.fc44.x86_64
  libsepol-3.10-1.fc44.x86_64
  libsmartcols-2.42-7.fc45.x86_64
  libssh-0.12.0-1.fc45.x86_64
  libssh-config-0.12.0-1.fc45.noarch
  libstdc++-16.0.1-0.11.fc45.x86_64
  libtasn1-4.21.0-1.fc45.x86_64
  libtool-ltdl-2.5.4-10.fc44.x86_64
  libunistring-1.1-11.fc44.x86_64
  libusb1-1.0.29-5.fc44.x86_64
  libuuid-2.42-7.fc45.x86_64
  libverto-0.3.2-12.fc44.x86_64
  libxcrypt-4.5.2-3.fc44.x86_64
  libxml2-2.12.10-6.fc44.x86_64
  libzstd-1.5.7-5.fc44.x86_64
  linkdupes-0.7.3-2.fc45.x86_64
  lua-libs-5.5.0-1.fc45.x86_64
  lua-srpm-macros-1-17.fc44.noarch
  lz4-libs-1.10.0-4.fc44.x86_64
  mpfr-4.2.2-3.fc44.x86_64
  ncurses-base-6.6-1.fc44.noarch
  ncurses-libs-6.6-1.fc44.x86_64
  nettle-3.10.1-3.fc44.x86_64
  ngtcp2-1.22.1-1.fc45.x86_64
  ngtcp2-crypto-ossl-1.22.1-1.fc45.x86_64
  npth-1.8-4.fc44.x86_64
  ocaml-srpm-macros-11-3.fc44.noarch
  openblas-srpm-macros-2-21.fc44.noarch
  openldap-2.6.13-1.fc45.x86_64
  openssl-libs-3.5.6-1.fc45.x86_64
  p11-kit-0.26.2-1.fc45.x86_64
  p11-kit-trust-0.26.2-1.fc45.x86_64
  package-notes-srpm-macros-0.17-3.fc45.noarch
  pam-libs-1.7.2-1.fc44.x86_64
  patch-2.8-4.fc44.x86_64
  pcre2-10.47-1.fc44.1.x86_64
  pcre2-syntax-10.47-1.fc44.1.noarch
  perl-srpm-macros-1-61.fc44.noarch
  pkgconf-2.5.1-1.fc45.x86_64
  pkgconf-m4-2.5.1-1.fc45.noarch
  pkgconf-pkg-config-2.5.1-1.fc45.x86_64
  policycoreutils-3.10-3.fc45.x86_64
  popt-1.19-10.fc44.x86_64
  publicsuffix-list-dafsa-20260116-1.fc44.noarch
  pyproject-srpm-macros-1.20.0-1.fc45.noarch
  python-srpm-macros-3.14-12.fc45.noarch
  qt5-srpm-macros-5.15.18-2.fc44.noarch
  qt6-srpm-macros-6.11.0-1.fc45.noarch
  readline-8.3-4.fc44.x86_64
  redhat-rpm-config-344-1.fc45.noarch
  redhat-systemd-presets-102-1.fc45.noarch
  redhat-systemd-presets-common-102-1.fc45.noarch
  rpm-6.0.1-5.fc45.x86_64
  rpm-build-6.0.1-5.fc45.x86_64
  rpm-build-libs-6.0.1-5.fc45.x86_64
  rpm-libs-6.0.1-5.fc45.x86_64
  rpm-plugin-selinux-6.0.1-5.fc45.x86_64
  rpm-sequoia-1.10.2-1.fc45.x86_64
  rpm-sign-libs-6.0.1-5.fc45.x86_64
  rust-srpm-macros-28.4-3.fc44.noarch
  sed-4.10-1.fc45.x86_64
  selinux-policy-45.1-1.fc45.noarch
  selinux-policy-targeted-45.1-1.fc45.noarch
  setup-2.15.0-29.fc45.noarch
  shadow-utils-4.19.3-2.fc45.x86_64
  sqlite-libs-3.52.0-1.fc45.x86_64
  systemd-libs-260.1-2.fc45.x86_64
  systemd-standalone-sysusers-260.1-2.fc45.x86_64
  tar-1.35-8.fc44.x86_64
  tpm2-tss-4.1.3-9.fc44.x86_64
  tree-sitter-srpm-macros-0.4.2-2.fc44.noarch
  unzip-6.0-69.fc44.x86_64
  util-linux-2.42-7.fc45.x86_64
  util-linux-core-2.42-7.fc45.x86_64
  which-2.23-4.fc44.x86_64
  xxhash-libs-0.8.3-4.fc44.x86_64
  xz-5.8.3-1.fc45.x86_64
  xz-libs-5.8.3-1.fc45.x86_64
  zig-srpm-macros-1-8.fc44.noarch
  zip-3.0-45.fc44.x86_64
  zlib-ng-compat-2.3.3-5.fc45.x86_64
  zstd-1.5.7-5.fc44.x86_64
Start: buildsrpm
Start: rpmbuild -bs
Building target platforms: x86_64
Building for target x86_64
setting SOURCE_DATE_EPOCH=1774483200
Wrote: /builddir/build/SRPMS/python-seaborn-0.13.2-18.fc45.src.rpm
Finish: rpmbuild -bs
INFO: chroot_scan: 1 files copied to /var/lib/copr-rpmbuild/results/chroot_scan
INFO: /var/lib/mock/fedora-rawhide-x86_64-1777362911.442062/root/var/log/dnf5.log
INFO: chroot_scan: creating tarball /var/lib/copr-rpmbuild/results/chroot_scan.tar.gz
/bin/tar: Removing leading `/' from member names
Finish: buildsrpm
INFO: Done(/var/lib/copr-rpmbuild/workspace/workdir-vf8ff3ev/python-seaborn/python-seaborn.spec) Config(child) 0 minutes 22 seconds
INFO: Results and/or logs in: /var/lib/copr-rpmbuild/results
INFO: Cleaning up build root ('cleanup_on_success=True')
Start: clean chroot
INFO: unmounting tmpfs.
Finish: clean chroot
INFO: Start(/var/lib/copr-rpmbuild/results/python-seaborn-0.13.2-18.fc45.src.rpm) Config(fedora-rawhide-x86_64)
Start(bootstrap): chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-x86_64-bootstrap-1777362911.442062/root.
INFO: reusing tmpfs at /var/lib/mock/fedora-rawhide-x86_64-bootstrap-1777362911.442062/root.
INFO: calling preinit hooks
INFO: enabled root cache
INFO: enabled package manager cache
Start(bootstrap): cleaning package manager metadata
Finish(bootstrap): cleaning package manager metadata
Finish(bootstrap): chroot init
Start: chroot init
INFO: mounting tmpfs at /var/lib/mock/fedora-rawhide-x86_64-1777362911.442062/root.
INFO: calling preinit hooks
INFO: enabled root cache
Start: unpacking root cache
Finish: unpacking root cache
INFO: enabled package manager cache
Start: cleaning package manager metadata
Finish: cleaning package manager metadata
INFO: enabled HW Info plugin
INFO: Buildroot is handled by package management downloaded with a bootstrap image:
  rpm-6.0.1-5.fc45.x86_64
  rpm-sequoia-1.10.2-1.fc45.x86_64
  dnf5-5.4.2.0-1.fc45.x86_64
  dnf5-plugins-5.4.2.0-1.fc45.x86_64
Finish: chroot init
Start: build phase for python-seaborn-0.13.2-18.fc45.src.rpm
Start: build setup for python-seaborn-0.13.2-18.fc45.src.rpm
Building target platforms: x86_64
Building for target x86_64
setting SOURCE_DATE_EPOCH=1774483200
Wrote: /builddir/build/SRPMS/python-seaborn-0.13.2-18.fc45.src.rpm
Updating and loading repositories:
 fedora 100% | 80.5 KiB/s | 37.0 KiB | 00m00s
 Copr repository 100% | 4.2 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package                          Arch    Version            Repository      Size
Installing:
 python3-devel                   x86_64  0:3.14.4-2.fc45    fedora       2.0 MiB
 python3-flit-core               noarch  0:3.12.0-10.fc44   fedora     265.0 KiB
 python3-husl                    noarch  0:4.0.3-38.fc44    fedora      21.5 KiB
 python3-numpydoc                noarch  0:1.9.0-2.fc45     copr_base  682.3 KiB
 python3-pytest                  noarch  0:9.0.3-1.fc45     fedora      23.4 MiB
Installing dependencies:
 expat                           x86_64  0:2.7.5-1.fc45     fedora     322.9 KiB
 mpdecimal                       x86_64  0:4.0.1-3.fc44     fedora     217.1 KiB
 pyproject-rpm-macros            noarch  0:1.20.0-1.fc45    fedora     132.4 KiB
 python-pip-wheel                noarch  0:26.0.1-2.fc45    fedora       1.2 MiB
 python-rpm-macros               noarch  0:3.14-12.fc45     fedora      28.0 KiB
 python3                         x86_64  0:3.14.4-2.fc45    fedora      28.7 KiB
 python3-babel                   noarch  0:2.18.0-1.fc45    fedora      30.3 MiB
 python3-charset-normalizer      noarch  0:3.4.7-1.fc45     fedora     405.6 KiB
 python3-docutils                noarch  0:0.22.4-2.fc44    fedora       5.5 MiB
 python3-idna                    noarch  0:3.13-1.fc45      fedora     597.5 KiB
 python3-imagesize               noarch  0:2.0.0-1.fc45     fedora      82.4 KiB
 python3-iniconfig               noarch  0:2.3.0-2.fc44     fedora      49.8 KiB
 python3-jinja2                  noarch  0:3.1.6-7.fc44     fedora       3.1 MiB
 python3-libs                    x86_64  0:3.14.4-2.fc45    fedora      43.8 MiB
 python3-markupsafe              x86_64  0:3.0.2-7.fc44     fedora      61.4 KiB
 python3-packaging               noarch  0:26.1-1.fc45      fedora     958.1 KiB
 python3-pluggy                  noarch  0:1.6.0-5.fc44     fedora     211.5 KiB
 python3-pygments                noarch  0:2.19.2-1.fc45    fedora      11.3 MiB
 python3-requests                noarch  0:2.33.1-1.fc45    fedora     477.3 KiB
 python3-roman-numerals          noarch  0:4.1.0-1.fc45     fedora      40.8 KiB
 python3-rpm-generators          noarch  0:14-15.fc45       fedora      81.7 KiB
 python3-rpm-macros              noarch  0:3.14-12.fc45     fedora       6.5 KiB
 python3-snowballstemmer         noarch  0:3.0.1-11.fc44    fedora       1.8 MiB
 python3-sphinx                  noarch  1:9.1.0-1.fc45     fedora      13.8 MiB
 python3-sphinx-theme-alabaster  noarch  0:0.7.16-12.fc44   fedora      42.0 KiB
 python3-urllib3                 noarch  0:2.6.3-3.fc45     fedora       1.1 MiB
 tzdata                          noarch  0:2026a-1.fc45     fedora       1.2 MiB
Transaction Summary:
 Installing: 32 packages
Total size of inbound packages is 31 MiB. Need to download 269 KiB.
After this operation, 143 MiB extra will be used (install 143 MiB, remove 0 B).
[ 1/32] python3-pytest-0:9.0.3-1.fc45.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 2/32] python3-devel-0:3.14.4-2.fc45.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 3/32] python3-iniconfig-0:2.3.0-2.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 4/32] python3-packaging-0:26.1-1.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 5/32] python3-pluggy-0:1.6.0-5.fc44.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 6/32] python3-pygments-0:2.19.2-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 7/32] python3-libs-0:3.14.4-2.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 8/32] python3-sphinx-1:9.1.0-1.fc45.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 9/32] expat-0:2.7.5-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[10/32] mpdecimal-0:4.0.1-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[11/32] python-pip-wheel-0:26.0.1-2.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[12/32] tzdata-0:2026a-1.fc45.noarch 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[13/32] python3-babel-0:2.18.0-1.fc45.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[14/32] python3-docutils-0:0.22.4-2.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[15/32] python3-imagesize-0:2.0.0-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[16/32] python3-jinja2-0:3.1.6-7.fc44.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[17/32] python3-requests-0:2.33.1-1.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[18/32] python3-roman-numerals-0:4.1.0- 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[19/32] python3-snowballstemmer-0:3.0.1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[20/32] python3-sphinx-theme-alabaster- 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[21/32] python3-markupsafe-0:3.0.2-7.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[22/32] python3-charset-normalizer-0:3. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[23/32] python3-idna-0:3.13-1.fc45.noar 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[24/32] python3-urllib3-0:2.6.3-3.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[25/32] python3-0:3.14.4-2.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[26/32] pyproject-rpm-macros-0:1.20.0-1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[27/32] python-rpm-macros-0:3.14-12.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[28/32] python3-rpm-generators-0:14-15. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[29/32] python3-rpm-macros-0:3.14-12.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[30/32] python3-husl-0:4.0.3-38.fc44.no 100% | 9.1 MiB/s | 18.6 KiB | 00m00s
[31/32] python3-flit-core-0:3.12.0-10.f 100% | 22.2 MiB/s | 91.0 KiB | 00m00s
[32/32] python3-numpydoc-0:1.9.0-2.fc45 100% | 386.5 KiB/s | 159.6 KiB | 00m00s
--------------------------------------------------------------------------------
[32/32] Total 100% | 507.0 KiB/s | 269.2 KiB | 00m01s
Running transaction
[ 1/34] Verify package files 100% | 275.0 B/s | 32.0 B | 00m00s
[ 2/34] Prepare transaction 100% | 415.0 B/s | 32.0 B | 00m00s
[ 3/34] Installing python-rpm-macros-0: 100% | 28.2 MiB/s | 28.9 KiB | 00m00s
[ 4/34] Installing python3-rpm-macros-0 100% | 0.0 B/s | 6.8 KiB | 00m00s
[ 5/34] Installing pyproject-rpm-macros 100% | 26.3 MiB/s | 134.7 KiB | 00m00s
[ 6/34] Installing tzdata-0:2026a-1.fc4 100% | 30.9 MiB/s | 1.5 MiB | 00m00s
[ 7/34] Installing python-pip-wheel-0:2 100% | 611.4 MiB/s | 1.2 MiB | 00m00s
[ 8/34] Installing mpdecimal-0:4.0.1-3. 100% | 213.5 MiB/s | 218.6 KiB | 00m00s
[ 9/34] Installing expat-0:2.7.5-1.fc45 100% | 12.2 MiB/s | 325.0 KiB | 00m00s
[10/34] Installing python3-libs-0:3.14. 100% | 269.5 MiB/s | 44.2 MiB | 00m00s
[11/34] Installing python3-0:3.14.4-2.f 100% | 1.7 MiB/s | 30.5 KiB | 00m00s
[12/34] Installing python3-packaging-0: 100% | 135.6 MiB/s | 972.3 KiB | 00m00s
[13/34] Installing python3-pygments-0:2 100% | 167.0 MiB/s | 11.5 MiB | 00m00s
[14/34] Installing python3-idna-0:3.13- 100% | 196.6 MiB/s | 604.0 KiB | 00m00s
[15/34] Installing python3-urllib3-0:2. 100% | 189.6 MiB/s | 1.1 MiB | 00m00s
[16/34] Installing python3-rpm-generato 100% | 80.9 MiB/s | 82.9 KiB | 00m00s
[17/34] Installing python3-iniconfig-0: 100% | 52.9 MiB/s | 54.1 KiB | 00m00s
[18/34] Installing python3-pluggy-0:1.6 100% | 42.6 MiB/s | 217.9 KiB | 00m00s
[19/34] Installing python3-babel-0:2.18 100% | 274.9 MiB/s | 30.5 MiB | 00m00s
[20/34] Installing python3-docutils-0:0 100% | 124.1 MiB/s | 5.6 MiB | 00m00s
[21/34] Installing python3-imagesize-0: 100% | 83.5 MiB/s | 85.5 KiB | 00m00s
[22/34] Installing python3-roman-numera 100% | 42.7 MiB/s | 43.7 KiB | 00m00s
[23/34] Installing python3-snowballstem 100% | 265.7 MiB/s | 1.9 MiB | 00m00s
[24/34] Installing python3-sphinx-theme 100% | 45.5 MiB/s | 46.6 KiB | 00m00s
[25/34] Installing python3-markupsafe-0 100% | 32.1 MiB/s | 65.7 KiB | 00m00s
[26/34] Installing python3-jinja2-0:3.1 100% | 309.2 MiB/s | 3.1 MiB | 00m00s
[27/34] Installing python3-charset-norm 100% | 20.3 MiB/s | 415.8 KiB | 00m00s
[28/34] Installing python3-requests-0:2 100% | 79.7 MiB/s | 489.5 KiB | 00m00s
[29/34] Installing python3-sphinx-1:9.1 100% | 155.8 MiB/s | 14.0 MiB | 00m00s
[30/34] Installing python3-numpydoc-0:1 100% | 29.6 MiB/s | 697.7 KiB | 00m00s
[31/34] Installing python3-pytest-0:9.0 100% | 287.9 MiB/s | 23.6 MiB | 00m00s
[32/34] Installing python3-devel-0:3.14 100% | 66.8 MiB/s | 2.0 MiB | 00m00s
[33/34] Installing python3-flit-core-0: 100% | 132.4 MiB/s | 271.2 KiB | 00m00s
[34/34] Installing python3-husl-0:4.0.3 100% | 695.9 KiB/s | 23.7 KiB | 00m00s
Warning: skipped OpenPGP checks for 1 package from repository: copr_base
Complete!
Building target platforms: x86_64
Building for target x86_64
setting SOURCE_DATE_EPOCH=1774483200
Wrote: /builddir/build/SRPMS/python-seaborn-0.13.2-18.fc45.src.rpm
Updating and loading repositories:
 fedora 100% | 79.1 KiB/s | 37.0 KiB | 00m00s
 Copr repository 100% | 4.0 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package "python3-devel-3.14.4-2.fc45.x86_64" is already installed.
Package "python3-flit-core-3.12.0-10.fc44.noarch" is already installed.
Package "python3-husl-4.0.3-38.fc44.noarch" is already installed.
Package "python3-numpydoc-1.9.0-2.fc45.noarch" is already installed.
Package "python3-pytest-9.0.3-1.fc45.noarch" is already installed.
Nothing to do.
Finish: build setup for python-seaborn-0.13.2-18.fc45.src.rpm
Start: rpmbuild python-seaborn-0.13.2-18.fc45.src.rpm
Building target platforms: x86_64
Building for target x86_64
setting SOURCE_DATE_EPOCH=1774483200
Executing(%mkbuilddir): /bin/sh -e /var/tmp/rpm-tmp.lHI2ea
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.v6BkFN
+ umask 022
+ cd /builddir/build/BUILD/python-seaborn-0.13.2-build
+ rm -rf seaborn/external/husl.py
+ rm -rf seaborn/external/docscrape.py
+ cd /builddir/build/BUILD/python-seaborn-0.13.2-build
+ rm -rf seaborn-0.13.2
+ /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/seaborn-0.13.2.tar.gz
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd seaborn-0.13.2
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/seaborn-husl.patch
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/seaborn-docscrape.patch
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/seaborn-numpy-removals.patch
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/pytest9.patch
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.JvEl9J
+ umask 022
+ cd /builddir/build/BUILD/python-seaborn-0.13.2-build
+ cd seaborn-0.13.2
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(packaging)'
+ echo 'python3dist(pip) >= 19'
+ '[' -f pyproject.toml ']'
+ echo '(python3dist(tomli) if python3-devel < 3.11)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer '
+ CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer '
+ FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules '
+ FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules '
+ VALAFLAGS=-g
+ RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes --cap-lints=warn'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 '
+ LT_SYS_LIBRARY_PATH=/usr/lib64:
+ CC=gcc
+ CXX=g++
+ TMPDIR=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir
+ RPM_TOXENV=py314
+ FEDORA=45
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir --output /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires --dep-overrides-file /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-dep-overrides -x stats
Handling flit_core >=3.2,<4 from build-system.requires
Requirement satisfied: flit_core >=3.2,<4 (installed: flit_core 3.12.0)
Handling numpy>=1.20,!=1.24.0 from hook generated metadata: Requires-Dist (seaborn)
Requirement not satisfied: numpy>=1.20,!=1.24.0
Handling pandas>=1.2 from hook generated metadata: Requires-Dist (seaborn)
Requirement not satisfied: pandas>=1.2
Handling matplotlib>=3.4,!=3.6.1 from hook generated metadata: Requires-Dist (seaborn)
Requirement not satisfied: matplotlib>=3.4,!=3.6.1
Handling pytest ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest ; extra == "dev"
Handling pytest-cov ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-cov ; extra == "dev"
Handling pytest-xdist ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-xdist ; extra == "dev"
Handling flake8 ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flake8 ; extra == "dev"
Handling mypy ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: mypy ; extra == "dev"
Handling pandas-stubs ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pandas-stubs ; extra == "dev"
Handling pre-commit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pre-commit ; extra == "dev"
Handling flit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flit ; extra == "dev"
Handling numpydoc ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: numpydoc ; extra == "docs"
Handling nbconvert ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: nbconvert ; extra == "docs"
Handling ipykernel ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: ipykernel ; extra == "docs"
Handling sphinx<6.0.0 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx<6.0.0 ; extra == "docs"
Handling sphinx-copybutton ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-copybutton ; extra == "docs"
Handling sphinx-issues ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-issues ; extra == "docs"
Handling sphinx-design ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-design ; extra == "docs"
Handling pyyaml ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pyyaml ; extra == "docs"
Handling pydata_sphinx_theme==0.10.0rc2 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pydata_sphinx_theme==0.10.0rc2 ; extra == "docs"
Handling scipy>=1.7 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement not satisfied: scipy>=1.7 ; extra == "stats"
Handling statsmodels>=0.12 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement not satisfied: statsmodels>=0.12 ; extra == "stats"
+ cat /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires
+ rm -rfv seaborn-0.13.2.dist-info/
removed 'seaborn-0.13.2.dist-info/METADATA'
removed 'seaborn-0.13.2.dist-info/WHEEL'
removed directory 'seaborn-0.13.2.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-seaborn-0.13.2-18.fc45.buildreqs.nosrc.rpm
INFO: Going to install missing dynamic buildrequires
Updating and loading repositories: fedora 100% | 113.2
KiB/s | 37.0 KiB | 00m00s
 Copr repository 100% | 6.4 KiB/s | 1.5 KiB | 00m00s
Repositories loaded.
Package "pyproject-rpm-macros-1.20.0-1.fc45.noarch" is already installed.
Package "python3-devel-3.14.4-2.fc45.x86_64" is already installed.
Package "python3-flit-core-3.12.0-10.fc44.noarch" is already installed.
Package "python3-husl-4.0.3-38.fc44.noarch" is already installed.
Package "python3-numpydoc-1.9.0-2.fc45.noarch" is already installed.
Package "python3-packaging-26.1-1.fc45.noarch" is already installed.
Package "python3-pytest-9.0.3-1.fc45.noarch" is already installed.
Package                          Arch    Version              Repository  Size
Installing:
 python3-matplotlib              x86_64  0:3.11.0~rc1-2.fc45  copr_base   34.3 MiB
 python3-numpy                   x86_64  1:2.4.4-1.fc45       fedora      41.3 MiB
 python3-pandas                  x86_64  0:2.3.3-3.fc44       fedora      41.7 MiB
 python3-pip                     noarch  0:26.0.1-2.fc45      fedora      11.4 MiB
 python3-scipy                   x86_64  0:1.16.2-3.fc44      fedora      64.5 MiB
 python3-statsmodels             x86_64  0:0.14.6-4.fc45      fedora      53.2 MiB
Installing dependencies:
 abattis-cantarell-vf-fonts      noarch  0:0.301-17.fc44      fedora      192.7 KiB
 cairo                           x86_64  0:1.18.4-6.fc44      fedora      1.8 MiB
 default-fonts-core-sans         noarch  0:4.3-1.fc45         fedora      11.9 KiB
 dejavu-sans-fonts               noarch  0:2.37-29.fc44       fedora      5.5 MiB
 flexiblas                       x86_64  0:3.5.0-2.fc44       fedora      38.0 KiB
 flexiblas-netlib                x86_64  0:3.5.0-2.fc44       fedora      16.3 MiB
 flexiblas-openblas-openmp       x86_64  0:3.5.0-2.fc44       fedora      39.1 KiB
 fontconfig                      x86_64  0:2.17.0-4.fc44      fedora      776.4 KiB
 fonts-filesystem                noarch  1:5.0.0-3.fc45       fedora      0.0 B
 freetype                        x86_64  0:2.14.3-1.fc45      fedora      918.3 KiB
 fribidi                         x86_64  0:1.0.16-4.fc44      fedora      190.0 KiB
 glib2                           x86_64  0:2.88.0-1.fc45      fedora      15.3 MiB
 google-noto-fonts-common        noarch  0:20260401-1.fc45    fedora      17.7 KiB
 google-noto-sans-vf-fonts       noarch  0:20260401-1.fc45    fedora      1.4 MiB
 graphite2                       x86_64  0:1.3.14-20.fc44     fedora      191.5 KiB
 harfbuzz                        x86_64  0:14.2.0-1.fc45      fedora      2.8 MiB
 jbigkit-libs                    x86_64  0:2.1-33.fc44        fedora      117.2 KiB
 lcms2                           x86_64  0:2.16-7.fc44        fedora      445.7 KiB
 libX11                          x86_64  0:1.8.13-1.fc45      fedora      1.3 MiB
 libX11-common                   noarch  0:1.8.13-1.fc45      fedora      1.1 MiB
 libXau                          x86_64  0:1.0.12-4.fc44      fedora      72.8 KiB
 libXext                         x86_64  0:1.3.6-5.fc44       fedora      89.8 KiB
 libXrender                      x86_64  0:0.9.12-4.fc44      fedora      49.9 KiB
 libgfortran                     x86_64  0:16.0.1-0.11.fc45   fedora      3.4 MiB
 libimagequant                   x86_64  0:4.1.0-2.fc44       fedora      707.6 KiB
 libjpeg-turbo                   x86_64  0:3.1.4.1-1.fc45     fedora      825.7 KiB
 liblerc                         x86_64  0:4.1.0-1.fc45       fedora      640.4 KiB
 libpng                          x86_64  2:1.6.56-1.fc45      fedora      249.6 KiB
 libqhull_r                      x86_64  1:8.0.2-8.fc44       fedora      491.2 KiB
 libquadmath                     x86_64  0:16.0.1-0.11.fc45   fedora      325.9 KiB
 libraqm                         x86_64  0:0.10.5-1.fc45      copr_base   28.5 KiB
 libtiff                         x86_64  0:4.7.1-2.fc44       fedora      640.2 KiB
 libwebp                         x86_64  0:1.6.0-3.fc44       fedora      968.0 KiB
 libxcb                          x86_64  0:1.17.0-7.fc44      fedora      1.1 MiB
 lzo                             x86_64  0:2.10-16.fc44       fedora      174.8 KiB
 openblas                        x86_64  0:0.3.29-3.fc45      fedora      111.7 KiB
 openblas-openmp                 x86_64  0:0.3.29-3.fc45      fedora      44.5 MiB
 openjpeg                        x86_64  0:2.5.4-3.fc44       fedora      464.2 KiB
 pixman                          x86_64  0:0.46.2-3.fc44      fedora      718.2 KiB
 python3-Bottleneck              x86_64  0:1.6.0-2.fc44       fedora      573.3 KiB
 python3-cairo                   x86_64  0:1.28.0-5.fc44      fedora      508.8 KiB
 python3-contourpy               x86_64  0:1.3.3-3.fc44       fedora      856.7 KiB
 python3-cycler                  noarch  0:0.11.0-20.fc44     fedora      37.8 KiB
 python3-dateutil                noarch  1:2.9.0.post0-10.fc45  fedora    878.0 KiB
 python3-fonttools               x86_64  0:4.62.1-1.fc45      copr_base   19.4 MiB
 python3-kiwisolver              x86_64  0:1.5.0-1.fc45       fedora      157.8 KiB
 python3-matplotlib-data-fonts   x86_64  0:3.11.0~rc1-2.fc45  copr_base   8.5 MiB
 python3-numexpr                 x86_64  0:2.14.1-1.fc44      fedora      859.7 KiB
 python3-numpy-f2py              x86_64  1:2.4.4-1.fc45       fedora      2.1 MiB
 python3-olefile                 noarch  0:0.47-13.fc44       fedora      346.5 KiB
 python3-patsy                   noarch  0:1.0.2-3.fc44       fedora      2.0 MiB
 python3-pillow                  x86_64  0:12.2.0-1.fc45      fedora      4.5 MiB
 python3-platformdirs            noarch  0:4.9.1-1.fc45       fedora      236.5 KiB
 python3-pooch                   noarch  0:1.9.0-1.fc44       fedora      657.1 KiB
 python3-pyparsing               noarch  0:3.1.2-15.fc44      fedora      1.0 MiB
 python3-pytz                    noarch  0:2026.1-1.fc45      fedora      224.0 KiB
 python3-six                     noarch  0:1.17.0-8.fc44      fedora      118.0 KiB
 xml-common                      noarch  0:0.6.3-68.fc44      fedora      78.4 KiB
Transaction Summary:
 Installing: 64 packages
Total size of inbound packages is 88 MiB. Need to download 21 MiB.
After this operation, 393 MiB extra will be used (install 393 MiB, remove 0 B).
[ 1/64] python3-numpy-1:2.4.4-1.fc45.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 2/64] python3-pip-0:26.0.1-2.fc45.noa 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 3/64] python3-scipy-0:1.16.2-3.fc44.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 4/64] python3-matplotlib-0:3.11.0~rc1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 5/64] flexiblas-netlib-0:3.5.0-2.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 6/64] python3-numpy-f2py-1:2.4.4-1.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 7/64] python3-dateutil-1:2.9.0.post0- 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 8/64] libgfortran-0:16.0.1-0.11.fc45.
100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[ 9/64] python3-pooch-0:1.9.0-1.fc44.no 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[10/64] dejavu-sans-fonts-0:2.37-29.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[11/64] freetype-0:2.14.3-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[12/64] libqhull_r-1:8.0.2-8.fc44.x86_6 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[13/64] python3-contourpy-0:1.3.3-3.fc4 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[14/64] python3-cycler-0:0.11.0-20.fc44 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[15/64] python3-kiwisolver-0:1.5.0-1.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[16/64] python3-pyparsing-0:3.1.2-15.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[17/64] python3-matplotlib-data-fonts-0 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[18/64] flexiblas-0:3.5.0-2.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[19/64] flexiblas-openblas-openmp-0:3.5 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[20/64] libquadmath-0:16.0.1-0.11.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[21/64] python3-six-0:1.17.0-8.fc44.noa 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[22/64] python3-platformdirs-0:4.9.1-1. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[23/64] fonts-filesystem-1:5.0.0-3.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[24/64] harfbuzz-0:14.2.0-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[25/64] libpng-2:1.6.56-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[26/64] openblas-openmp-0:0.3.29-3.fc45 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[27/64] glib2-0:2.88.0-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[28/64] graphite2-0:1.3.14-20.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[29/64] libraqm-0:0.10.5-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[30/64] fribidi-0:1.0.16-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[31/64] python3-pillow-0:12.2.0-1.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[32/64] lcms2-0:2.16-7.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[33/64] libimagequant-0:4.1.0-2.fc44.x8 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[34/64] libjpeg-turbo-0:3.1.4.1-1.fc45. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[35/64] libtiff-0:4.7.1-2.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[36/64] libwebp-0:1.6.0-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[37/64] libxcb-0:1.17.0-7.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[38/64] openjpeg-0:2.5.4-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[39/64] python3-olefile-0:0.47-13.fc44. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[40/64] jbigkit-libs-0:2.1-33.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[41/64] liblerc-0:4.1.0-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[42/64] libXau-0:1.0.12-4.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[43/64] python3-cairo-0:1.28.0-5.fc44.x 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[44/64] cairo-0:1.18.4-6.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[45/64] fontconfig-0:2.17.0-4.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[46/64] libX11-0:1.8.13-1.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[47/64] libXext-0:1.3.6-5.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[48/64] libXrender-0:0.9.12-4.fc44.x86_ 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[49/64] lzo-0:2.10-16.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[50/64] pixman-0:0.46.2-3.fc44.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[51/64] default-fonts-core-sans-0:4.3-1 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[52/64] xml-common-0:0.6.3-68.fc44.noar 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[53/64] libX11-common-0:1.8.13-1.fc45.n 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[54/64] abattis-cantarell-vf-fonts-0:0. 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[55/64] google-noto-sans-vf-fonts-0:202 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[56/64] google-noto-fonts-common-0:2026 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[57/64] python3-fonttools-0:4.62.1-1.fc 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[58/64] openblas-0:0.3.29-3.fc45.x86_64 100% | 0.0 B/s | 0.0 B | 00m00s >>> Already downloaded
[59/64] python3-Bottleneck-0:1.6.0-2.fc 100% | 3.6 MiB/s | 181.6 KiB | 00m00s
[60/64] python3-numexpr-0:2.14.1-1.fc44 100% | 2.8 MiB/s | 232.6 KiB | 00m00s
[61/64] python3-pytz-0:2026.1-1.fc45.no 100% | 843.9 KiB/s | 65.8 KiB | 00m00s
[62/64] python3-patsy-0:1.0.2-3.fc44.no 100% | 1.0 MiB/s | 402.5 KiB | 00m00s
>>> Status code: 404 for https://dl.fedoraproject.org/pub/fedora/linux/development/rawhide/Everything/x86_64/os/Packages/p/python3-patsy-1.0.2-3.fc44.noarch.rpm (IP: 10.16.163.49) - https://dl.fedoraproject.org/pub/fedora/linux/development/rawhide/Everything/x86_64/os/Packages/p/python3-patsy-1.0.2-3.fc44.noarch.rpm
>>> Status code: 404 for https://dl.fedoraproject.org/pub/fedora/linux/development/rawhide/Everything/x86_64/os/Packages/p/python3-patsy-1.0.2-3.fc44.noarch.rpm (IP: 10.16.163.49) - https://dl.fedoraproject.org/pub/fedora/linux/development/rawhide/Everything/x86_64/os/Packages/p/python3-patsy-1.0.2-3.fc44.noarch.rpm
[63/64] python3-pandas-0:2.3.3-3.fc44.x 100% | 12.7 MiB/s | 7.7 MiB | 00m01s
[64/64] python3-statsmodels-0:0.14.6-4. 100% | 13.8 MiB/s | 12.1 MiB | 00m01s
--------------------------------------------------------------------------------
[64/64] Total 100% | 21.5 MiB/s | 20.7 MiB | 00m01s
Running transaction
[ 1/66] Verify package files 100% | 192.0 B/s | 64.0 B | 00m00s
[ 2/66] Prepare transaction 100% | 528.0 B/s | 64.0 B | 00m00s
[ 3/66] Installing fonts-filesystem-1:5 100% | 769.5 KiB/s | 788.0 B | 00m00s
[ 4/66] Installing libgfortran-0:16.0.1 100% | 341.0 MiB/s | 3.4 MiB | 00m00s
[ 5/66] Installing libwebp-0:1.6.0-3.fc 100% | 237.3 MiB/s | 972.1 KiB | 00m00s
[ 6/66] Installing libjpeg-turbo-0:3.1. 100% | 269.3 MiB/s | 827.4 KiB | 00m00s
[ 7/66] Installing libpng-2:1.6.56-1.fc 100% | 122.5 MiB/s | 250.9 KiB | 00m00s
[ 8/66] Installing dejavu-sans-fonts-0: 100% | 344.3 MiB/s | 5.5 MiB | 00m00s
[ 9/66] Installing abattis-cantarell-vf 100% | 189.9 MiB/s | 194.4 KiB | 00m00s
[10/66] Installing openblas-0:0.3.29-3. 100% | 110.8 MiB/s | 113.5 KiB | 00m00s
[11/66] Installing openblas-openmp-0:0. 100% | 517.2 MiB/s | 44.5 MiB | 00m00s
[12/66] Installing python3-fonttools-0: 100% | 268.6 MiB/s | 19.6 MiB | 00m00s
[13/66] Installing google-noto-fonts-co 100% | 0.0 B/s | 18.5 KiB | 00m00s
[14/66] Installing google-noto-sans-vf- 100% | 278.3 MiB/s | 1.4 MiB | 00m00s
[15/66] Installing default-fonts-core-s 100% | 8.9 MiB/s | 18.2 KiB | 00m00s
[16/66] Installing libX11-common-0:1.8. 100% | 62.4 MiB/s | 1.2 MiB | 00m00s
[17/66] Installing xml-common-0:0.6.3-6 100% | 39.6 MiB/s | 81.1 KiB | 00m00s
[18/66] Installing pixman-0:0.46.2-3.fc 100% | 234.1 MiB/s | 719.3 KiB | 00m00s
[19/66] Installing lzo-0:2.10-16.fc44.x 100% | 172.3 MiB/s | 176.4 KiB | 00m00s
[20/66] Installing libXau-0:1.0.12-4.fc 100% | 72.6 MiB/s | 74.3 KiB | 00m00s
[21/66] Installing libxcb-0:1.17.0-7.fc 100% | 179.6 MiB/s | 1.1 MiB | 00m00s
[22/66] Installing libX11-0:1.8.13-1.fc 100% | 269.6 MiB/s | 1.3 MiB | 00m00s
[23/66] Installing libXext-0:1.3.6-5.fc 100% | 88.9 MiB/s | 91.1 KiB | 00m00s
[24/66] Installing libXrender-0:0.9.12- 100% | 50.0 MiB/s | 51.2 KiB | 00m00s
[25/66] Installing liblerc-0:4.1.0-1.fc 100% | 209.0 MiB/s | 642.0 KiB | 00m00s
[26/66] Installing jbigkit-libs-0:2.1-3 100% | 116.4 MiB/s | 119.2 KiB | 00m00s
[27/66] Installing libtiff-0:4.7.1-2.fc 100% | 209.1 MiB/s | 642.5 KiB | 00m00s
[28/66] Installing python3-olefile-0:0. 100% | 170.8 MiB/s | 349.8 KiB | 00m00s
[29/66] Installing openjpeg-0:2.5.4-3.f 100% | 227.6 MiB/s | 466.1 KiB | 00m00s
[30/66] Installing libimagequant-0:4.1. 100% | 230.9 MiB/s | 709.2 KiB | 00m00s
[31/66] Installing lcms2-0:2.16-7.fc44. 100% | 218.4 MiB/s | 447.3 KiB | 00m00s
[32/66] Installing fribidi-0:1.0.16-4.f 100% | 9.4 MiB/s | 192.5 KiB | 00m00s
[33/66] Installing graphite2-0:1.3.14-2 100% | 10.0 MiB/s | 193.6 KiB | 00m00s
[34/66] Installing glib2-0:2.88.0-1.fc4 100% | 231.7 MiB/s | 15.3 MiB | 00m00s
[35/66] Installing freetype-0:2.14.3-1. 100% | 224.6 MiB/s | 920.0 KiB | 00m00s
[36/66] Installing harfbuzz-0:14.2.0-1. 100% | 283.7 MiB/s | 2.8 MiB | 00m00s
[37/66] Installing libraqm-0:0.10.5-1.f 100% | 28.9 MiB/s | 29.6 KiB | 00m00s
[38/66] Installing python3-pillow-0:12. 100% | 237.8 MiB/s | 4.5 MiB | 00m00s
[39/66] Installing fontconfig-0:2.17.0- 100% | 757.1 KiB/s | 795.8 KiB | 00m01s
[40/66] Installing cairo-0:1.18.4-6.fc4 100% | 261.0 MiB/s | 1.8 MiB | 00m00s
[41/66] Installing python3-cairo-0:1.28 100% | 166.8 MiB/s | 512.3 KiB | 00m00s
[42/66] Installing python3-platformdirs 100% | 118.9 MiB/s | 243.4 KiB | 00m00s
[43/66] Installing python3-pooch-0:1.9. 100% | 131.5 MiB/s | 673.1 KiB | 00m00s
[44/66] Installing python3-six-0:1.17.0 100% | 117.5 MiB/s | 120.3 KiB | 00m00s
[45/66] Installing python3-dateutil-1:2 100% | 174.1 MiB/s | 891.5 KiB | 00m00s
[46/66] Installing libquadmath-0:16.0.1 100% | 159.7 MiB/s | 327.1 KiB | 00m00s
[47/66] Installing flexiblas-netlib-0:3 100% | 247.4 MiB/s | 16.3 MiB | 00m00s
[48/66] Installing flexiblas-0:3.5.0-2. 100% | 38.2 MiB/s | 39.2 KiB | 00m00s
[49/66] Installing flexiblas-openblas-o 100% | 9.8 MiB/s | 39.9 KiB | 00m00s
[50/66] Installing python3-numpy-1:2.4. 100% | 255.5 MiB/s | 41.6 MiB | 00m00s
[51/66] Installing python3-numpy-f2py-1 100% | 55.9 MiB/s | 2.2 MiB | 00m00s
[52/66] Installing python3-scipy-0:1.16 100% | 287.2 MiB/s | 64.9 MiB | 00m00s
[53/66] Installing python3-Bottleneck-0 100% | 97.0 MiB/s | 596.0 KiB | 00m00s
[54/66] Installing python3-numexpr-0:2. 100% | 170.2 MiB/s | 871.4 KiB | 00m00s
[55/66] Installing python3-patsy-0:1.0. 100% | 281.3 MiB/s | 2.0 MiB | 00m00s
[56/66] Installing python3-contourpy-0: 100% | 169.8 MiB/s | 869.6 KiB | 00m00s
[57/66] Installing python3-matplotlib-d 100% | 339.7 MiB/s | 8.5 MiB | 00m00s
[58/66] Installing python3-pyparsing-0: 100% | 257.4 MiB/s | 1.0 MiB | 00m00s
[59/66] Installing python3-kiwisolver-0 100% | 79.0 MiB/s | 161.9 KiB | 00m00s
[60/66] Installing python3-cycler-0:0.1 100% | 39.3 MiB/s | 40.3 KiB | 00m00s
[61/66] Installing libqhull_r-1:8.0.2-8 100% | 240.2 MiB/s | 492.0 KiB | 00m00s
[62/66] Installing python3-pytz-0:2026.
100% | 44.8 MiB/s | 229.4 KiB | 00m00s [63/66] Installing python3-pandas-0:2.3 100% | 299.6 MiB/s | 41.9 MiB | 00m00s [64/66] Installing python3-statsmodels- 100% | 225.9 MiB/s | 54.0 MiB | 00m00s [65/66] Installing python3-matplotlib-0 100% | 319.3 MiB/s | 34.5 MiB | 00m00s [66/66] Installing python3-pip-0:26.0.1 100% | 48.9 MiB/s | 11.7 MiB | 00m00s Warning: skipped OpenPGP checks for 4 packages from repository: copr_base Complete! Building target platforms: x86_64 Building for target x86_64 setting SOURCE_DATE_EPOCH=1774483200 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.tv8Mus + umask 022 + cd /builddir/build/BUILD/python-seaborn-0.13.2-build + cd seaborn-0.13.2 + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(packaging)' + echo 'python3dist(pip) >= 19' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + echo -n + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' 
+ FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + VALAFLAGS=-g + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes --cap-lints=warn' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + LT_SYS_LIBRARY_PATH=/usr/lib64: + CC=gcc + CXX=g++ + TMPDIR=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + RPM_TOXENV=py314 + FEDORA=45 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir --output /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires --dep-overrides-file /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-dep-overrides -x stats Handling 
flit_core >=3.2,<4 from build-system.requires
Requirement satisfied: flit_core >=3.2,<4 (installed: flit_core 3.12.0)
Handling numpy>=1.20,!=1.24.0 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: numpy>=1.20,!=1.24.0 (installed: numpy 2.4.4)
Handling pandas>=1.2 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: pandas>=1.2 (installed: pandas 2.3.3)
Handling matplotlib>=3.4,!=3.6.1 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: matplotlib>=3.4,!=3.6.1 (installed: matplotlib 3.11.0rc1)
Handling pytest ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest ; extra == "dev"
Handling pytest-cov ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-cov ; extra == "dev"
Handling pytest-xdist ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-xdist ; extra == "dev"
Handling flake8 ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flake8 ; extra == "dev"
Handling mypy ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: mypy ; extra == "dev"
Handling pandas-stubs ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pandas-stubs ; extra == "dev"
Handling pre-commit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pre-commit ; extra == "dev"
Handling flit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flit ; extra == "dev"
Handling numpydoc ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: numpydoc ; extra == "docs"
Handling nbconvert ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: nbconvert ; extra == "docs"
Handling ipykernel ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: ipykernel ; extra == "docs"
Handling sphinx<6.0.0 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx<6.0.0 ; extra == "docs"
Handling sphinx-copybutton ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-copybutton ; extra == "docs"
Handling sphinx-issues ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-issues ; extra == "docs"
Handling sphinx-design ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-design ; extra == "docs"
Handling pyyaml ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pyyaml ; extra == "docs"
Handling pydata_sphinx_theme==0.10.0rc2 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pydata_sphinx_theme==0.10.0rc2 ; extra == "docs"
Handling scipy>=1.7 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: scipy>=1.7 ; extra == "stats" (installed: scipy 1.16.2)
Handling statsmodels>=0.12 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: statsmodels>=0.12 ; extra == "stats" (installed: statsmodels 0.14.6)
+ cat /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires
+ rm -rfv seaborn-0.13.2.dist-info/
removed 'seaborn-0.13.2.dist-info/METADATA'
removed 'seaborn-0.13.2.dist-info/WHEEL'
removed directory 'seaborn-0.13.2.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-seaborn-0.13.2-18.fc45.buildreqs.nosrc.rpm
INFO: Going to install missing dynamic buildrequires
Updating and loading repositories: fedora 100% | 86.9 KiB/s | 37.0
KiB | 00m00s Copr repository 100% | 4.6 KiB/s | 1.5 KiB | 00m00s Repositories loaded. Package "pyproject-rpm-macros-1.20.0-1.fc45.noarch" is already installed. Package "python3-devel-3.14.4-2.fc45.x86_64" is already installed. Package "python3-flit-core-3.12.0-10.fc44.noarch" is already installed. Package "python3-husl-4.0.3-38.fc44.noarch" is already installed. Package "python3-numpydoc-1.9.0-2.fc45.noarch" is already installed. Package "python3-packaging-26.1-1.fc45.noarch" is already installed. Package "python3-pandas-2.3.3-3.fc44.x86_64" is already installed. Package "python3-pip-26.0.1-2.fc45.noarch" is already installed. Package "python3-pytest-9.0.3-1.fc45.noarch" is already installed. Package "python3-scipy-1.16.2-3.fc44.x86_64" is already installed. Package "python3-statsmodels-0.14.6-4.fc45.x86_64" is already installed. Nothing to do. Building target platforms: x86_64 Building for target x86_64 setting SOURCE_DATE_EPOCH=1774483200 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.bKkS8b + umask 022 + cd /builddir/build/BUILD/python-seaborn-0.13.2-build + cd seaborn-0.13.2 + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(packaging)' + echo 'python3dist(pip) >= 19' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + echo -n + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + VALAFLAGS=-g + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes --cap-lints=warn' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + LT_SYS_LIBRARY_PATH=/usr/lib64: + CC=gcc + CXX=g++ + TMPDIR=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + RPM_TOXENV=py314 + FEDORA=45 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs 
/usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir --output /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires --dep-overrides-file /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-dep-overrides -x stats
Handling flit_core >=3.2,<4 from build-system.requires
Requirement satisfied: flit_core >=3.2,<4 (installed: flit_core 3.12.0)
Handling numpy>=1.20,!=1.24.0 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: numpy>=1.20,!=1.24.0 (installed: numpy 2.4.4)
Handling pandas>=1.2 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: pandas>=1.2 (installed: pandas 2.3.3)
Handling matplotlib>=3.4,!=3.6.1 from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: matplotlib>=3.4,!=3.6.1 (installed: matplotlib 3.11.0rc1)
Handling pytest ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest ; extra == "dev"
Handling pytest-cov ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-cov ; extra == "dev"
Handling pytest-xdist ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pytest-xdist ; extra == "dev"
Handling flake8 ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flake8 ; extra == "dev"
Handling mypy ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: mypy ; extra == "dev"
Handling pandas-stubs ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pandas-stubs ; extra == "dev"
Handling pre-commit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pre-commit ; extra == "dev"
Handling flit ; extra == "dev" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: flit ; extra == "dev"
Handling numpydoc ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: numpydoc ; extra == "docs"
Handling nbconvert ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: nbconvert ; extra == "docs"
Handling ipykernel ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: ipykernel ; extra == "docs"
Handling sphinx<6.0.0 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx<6.0.0 ; extra == "docs"
Handling sphinx-copybutton ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-copybutton ; extra == "docs"
Handling sphinx-issues ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-issues ; extra == "docs"
Handling sphinx-design ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: sphinx-design ; extra == "docs"
Handling pyyaml ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pyyaml ; extra == "docs"
Handling pydata_sphinx_theme==0.10.0rc2 ; extra == "docs" from hook generated metadata: Requires-Dist (seaborn)
Ignoring alien requirement: pydata_sphinx_theme==0.10.0rc2 ; extra == "docs"
Handling scipy>=1.7 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: scipy>=1.7 ; extra == "stats" (installed: scipy 1.16.2)
Handling statsmodels>=0.12 ; extra == "stats" from hook generated metadata: Requires-Dist (seaborn)
Requirement satisfied: statsmodels>=0.12 ; extra == "stats" (installed: statsmodels 0.14.6)
+ cat
/builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-buildrequires + rm -rfv seaborn-0.13.2.dist-info/ removed 'seaborn-0.13.2.dist-info/METADATA' removed 'seaborn-0.13.2.dist-info/WHEEL' removed directory 'seaborn-0.13.2.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.MTBWj6 + umask 022 + cd /builddir/build/BUILD/python-seaborn-0.13.2-build + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd seaborn-0.13.2 + mkdir -p /builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 
-mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + VALAFLAGS=-g + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + LT_SYS_LIBRARY_PATH=/usr/lib64: + CC=gcc + CXX=g++ + TMPDIR=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_wheel.py /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir Processing ./. 
Preparing metadata (pyproject.toml): started Running command Preparing metadata (pyproject.toml) Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: seaborn Building wheel for seaborn (pyproject.toml): started Running command Building wheel for seaborn (pyproject.toml) Building wheel for seaborn (pyproject.toml): finished with status 'done' Created wheel for seaborn: filename=seaborn-0.13.2-py3-none-any.whl size=293566 sha256=90dd8c9e6c43a3ae8027884f538fce7c7975be2beb0144d2e8e939e8b9767a98 Stored in directory: /builddir/.cache/pip/wheels/84/28/e4/4b23a315e08822be8f91ea2c0625b795c62a8564fbf24b143d Successfully built seaborn + RPM_EC=0 ++ jobs -p + exit 0 Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.6NB0od + umask 022 + cd /builddir/build/BUILD/python-seaborn-0.13.2-build + '[' /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT '!=' / ']' + rm -rf /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT ++ dirname /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT + mkdir -p /builddir/build/BUILD/python-seaborn-0.13.2-build + mkdir /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 
-march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd seaborn-0.13.2 ++ ls /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir/seaborn-0.13.2-py3-none-any.whl ++ xargs basename 
--multiple ++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/' + specifier=seaborn==0.13.2 + '[' -z seaborn==0.13.2 ']' + TMPDIR=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir + /usr/bin/python3 -m pip install --root /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT --prefix /usr --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir seaborn==0.13.2 Using pip 26.0.1 from /usr/lib/python3.14/site-packages/pip (python 3.14) Looking in links: /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir Processing /builddir/build/BUILD/python-seaborn-0.13.2-build/pyproject-wheeldir/seaborn-0.13.2-py3-none-any.whl Installing collected packages: seaborn Successfully installed seaborn-0.13.2 + '[' -d /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/bin ']' + rm -f /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-ghost-distinfo + site_dirs=() + '[' -d /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages ']' + site_dirs+=("/usr/lib/python3.14/site-packages") + '[' /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib64/python3.14/site-packages '!=' /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages ']' + '[' -d /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib64/python3.14/site-packages ']' + for site_dir in ${site_dirs[@]} + for distinfo in /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT$site_dir/*.dist-info + echo '%ghost %dir /usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info' + sed -i s/pip/rpm/ /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/INSTALLER + PYTHONPATH=/usr/lib/rpm/redhat + /usr/bin/python3 -B 
/usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT --record /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/RECORD --output /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-record + rm -fv /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/RECORD removed '/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/RECORD' + rm -fv /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/REQUESTED removed '/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn-0.13.2.dist-info/REQUESTED' + '[' -f /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-dep-overrides ']' ++ wc -l /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-ghost-distinfo ++ cut -f1 '-d ' + lines=1 + '[' 1 -ne 1 ']' + RPM_FILES_ESCAPE=4.19 + /usr/bin/python3 /usr/lib/rpm/redhat/pyproject_save_files.py --output-files /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-files --output-modules /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-modules --buildroot /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT --sitelib /usr/lib/python3.14/site-packages --sitearch /usr/lib64/python3.14/site-packages --python-version 3.14 --pyproject-record /builddir/build/BUILD/python-seaborn-0.13.2-build/python-seaborn-0.13.2-18.fc45.x86_64-pyproject-record --prefix /usr seaborn + /usr/lib/rpm/check-buildroot + /usr/lib/rpm/redhat/brp-ldconfig + COMPRESS='gzip -9 -n' + COMPRESS_EXT=.gz + /usr/lib/rpm/brp-compress + 
/usr/lib/rpm/brp-strip /usr/bin/strip
+ /usr/lib/rpm/brp-strip-comment-note /usr/bin/strip /usr/bin/objdump
+ /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip
+ /usr/lib/rpm/check-rpaths
+ /usr/lib/rpm/redhat/brp-mangle-shebangs
+ /usr/lib/rpm/brp-remove-la-files
+ /usr/lib/rpm/redhat/brp-python-rpm-in-distinfo
+ env /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0 -j2
Bytecompiling .py files below /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14 using python3.14
+ /usr/lib/rpm/redhat/brp-python-hardlink
+ /usr/bin/add-det --brp -j2 /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/widgets.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/utils.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/relational.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/regression.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/cm.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/regression.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/rcmod.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/objects.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/miscplot.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/palettes.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/matrix.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/distributions.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/categorical.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/algorithms.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_testing.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_testing.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_statistics.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_docstrings.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_docstrings.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_compat.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/axisgrid.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_base.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/__pycache__/_base.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/kde.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/version.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/husl.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/docscrape.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/appdirs.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/colors/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/external/__pycache__/docscrape.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/colors/__pycache__/crayons.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/regression.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/order.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/colors/__pycache__/xkcd_rgb.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/density.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/counting.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/base.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/text.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_stats/__pycache__/aggregation.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/line.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/dot.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/base.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/base.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/bar.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_marks/__pycache__/area.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/rules.cpython-314.opt-1.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/rules.cpython-314.pyc: replacing with normalized version
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/scales.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/properties.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/typing.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/subplots.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/__init__.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/moves.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/groupby.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/exceptions.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/data.cpython-314.opt-1.pyc: rewriting with normalized contents
/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages/seaborn/_core/__pycache__/plot.cpython-314.opt-1.pyc: rewriting with normalized contents Scanned 19 directories and 166 files, processed 61 inodes, 61 modified (14 replaced + 47 rewritten), 0 unsupported format, 0 errors + /usr/bin/linkdupes --brp /builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr Scanned 18 directories and 166 files, considered 166 files, read 3 files, linked 3 files, 0 errors sum of sizes of linked files: 0 bytes Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.s4cFI3 + umask 022 + cd /builddir/build/BUILD/python-seaborn-0.13.2-build + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic 
-fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -I/usr/lib64/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-specs=/usr/lib/rpm/redhat/redhat-package-notes --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib64: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd seaborn-0.13.2 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -march=x86-64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -mtls-dialect=gnu2 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,pack-relative-relocs -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld 
-specs=/usr/lib/rpm/redhat/redhat-hardened-ld-errors -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 -specs=/usr/lib/rpm/redhat/redhat-package-notes ' + PATH=/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin + PYTHONPATH=/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib64/python3.14/site-packages:/builddir/build/BUILD/python-seaborn-0.13.2-build/BUILDROOT/usr/lib/python3.14/site-packages + PYTHONDONTWRITEBYTECODE=1 + PYTEST_ADDOPTS=' --ignore=/builddir/build/BUILD/python-seaborn-0.13.2-build/.pyproject-builddir' + PYTEST_XDIST_AUTO_NUM_WORKERS=2 + /usr/bin/pytest --deselect tests/_core/test_plot.py::TestLabelVisibility::test_1d_column_wrapped --deselect tests/_core/test_plot.py::TestLabelVisibility::test_1d_row_wrapped --deselect tests/test_distributions.py::TestKDEPlotBivariate::test_weights ============================= test session starts ============================== platform linux -- Python 3.14.4, pytest-9.0.3, pluggy-1.6.0 rootdir: /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2 configfile: pyproject.toml collected 2380 items / 4 deselected / 2376 selected tests/_core/test_data.py ....................................s [ 1%] tests/_core/test_groupby.py ........... [ 2%] tests/_core/test_moves.py .................................. [ 3%] tests/_core/test_plot.py .................FFFFFFx.FFFF..FFFFFxFFFxFFFFFF [ 5%] FFFFFFFFFFxFxFF..FFFFFFFFFFFFFF...FFFFFFFFF.........FFFFF....FFFFFF...FF [ 8%] F.................F..FF.FFFF.F....FFFFF.FF.FFFFFFF..FFFFFFFFFFFFFFFF.... [ 11%] ........ [ 11%] tests/_core/test_properties.py ......................................... [ 13%] ........................................................................ [ 16%] .............................................. [ 18%] tests/_core/test_rules.py ... 
[ 18%] tests/_core/test_scales.py FFF.FFFFFFFFFFFF.FFFFFFFFFFF.FFFFFFFFFF.FFFFF [ 20%] FFFFFFFFFFxFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF [ 22%] tests/_core/test_subplots.py .................................... [ 24%] tests/_marks/test_area.py FFFFFF [ 24%] tests/_marks/test_bar.py FFFFFFFFFFFFFFFF [ 25%] tests/_marks/test_base.py ........... [ 25%] tests/_marks/test_dot.py FFFFFFFFFFFFFF [ 26%] tests/_marks/test_line.py FFFFFFFFFFFFFFFFFFFFFFFFFFFFF [ 27%] tests/_marks/test_text.py FFFFFFFF [ 28%] tests/_stats/test_aggregation.py .......... [ 28%] tests/_stats/test_counting.py ............................. [ 29%] tests/_stats/test_density.py ......................... [ 30%] tests/_stats/test_order.py ...... [ 30%] tests/_stats/test_regression.py ... [ 31%] tests/test_algorithms.py ............. [ 31%] tests/test_axisgrid.py ................................................. [ 33%] ........................................................................ [ 36%] .. [ 36%] tests/test_base.py ..................................................... [ 39%] ........................................................................ [ 42%] ........................................ [ 43%] tests/test_categorical.py .............................................. [ 45%] .....................................................................F.. [ 48%] ........................................................................ [ 51%] ...............F........................................................ [ 54%] ........................................................................ [ 57%] ........................................................................ [ 60%] ........................................................................ [ 63%] ........................................................................ [ 66%] ..................................................................... [ 69%] tests/test_distributions.py ............................................ 
[ 71%] ..........s....................F........................................ [ 74%] ........................................................................ [ 77%] ...................................................................... [ 80%] tests/test_docstrings.py .... [ 80%] tests/test_matrix.py ...............................................ss.. [ 82%] ........................................ [ 84%] tests/test_miscplot.py .s [ 84%] tests/test_objects.py . [ 84%] tests/test_palettes.py ...................................... [ 86%] tests/test_rcmod.py ....................s.s [ 87%] tests/test_regression.py ..................sss.......................... [ 89%] ..........ss.. [ 89%] tests/test_relational.py ............................................... [ 91%] ..........F............................................................. [ 94%] [ 94%] tests/test_statistics.py ............................................... [ 96%] ......................... [ 97%] tests/test_utils.py .............................sss.s.............. [100%] =================================== FAILURES =================================== _____________________ TestLayerAddition.test_without_data ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd703f380> args = ('x', (.identity at 0x7f3fd7610d50>, .identity at 0x7f3fd7610d50>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_without_data(self, long_df): > p = Plot(long_df, x="x", y="y").add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestLayerAddition.test_with_new_variable_by_name _______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd721acf0> args = ('x', (.identity at 0x7f3fd6fda4b0>, .identity at 0x7f3fd6fda4b0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_with_new_variable_by_name(self, long_df): > p = Plot(long_df, x="x").add(MockMark(), y="y").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:205: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________ TestLayerAddition.test_with_new_variable_by_vector ______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd7252510> args = ('x', (.identity at 0x7f3fd72410c0>, .identity at 0x7f3fd72410c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_with_new_variable_by_vector(self, long_df): > p = Plot(long_df, x="x").add(MockMark(), y=long_df["y"]).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:213: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestLayerAddition.test_with_late_data_definition _______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
    [... _setup_scales body as shown above ...]
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd649de80>
args = ('x', (.identity at 0x7f3fd71e7950>, .identity at 0x7f3fd71e7950>))
kwargs = {}, axis = None

    [... matplotlib scale.py wrapper as shown above ...]
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_with_late_data_definition(self, long_df):
>       p = Plot().add(MockMark(), data=long_df, x="x", y="y").plot()

tests/_core/test_plot.py:221:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    [... _setup_scales body as shown above ...]
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________ TestLayerAddition.test_with_new_data_definition ________________

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales body as shown above ...]
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd65697f0>
args = ('x', (.identity at 0x7f3fd6d52b90>, .identity at 0x7f3fd6d52b90>))
kwargs = {}, axis = None

    [... matplotlib scale.py wrapper as shown above ...]
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_with_new_data_definition(self, long_df):
        long_df_sub = long_df.sample(frac=.5)
>       p = Plot(long_df, x="x", y="y").add(MockMark(), data=long_df_sub).plot()

tests/_core/test_plot.py:231:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    [... _setup_scales body as shown above ...]
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________ TestLayerAddition.test_drop_variable _____________________

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales body as shown above ...]
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd70f9160>
args = ('x', (.identity at 0x7f3fd714d7a0>, .identity at 0x7f3fd714d7a0>))
kwargs = {}, axis = None

    [... matplotlib scale.py wrapper as shown above ...]
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_drop_variable(self, long_df):
>       p = Plot(long_df, x="x", y="y").add(MockMark(), y=None).plot()

tests/_core/test_plot.py:241:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    [... _setup_scales body as shown above ...]
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________ TestLayerAddition.test_orient[x-x] ______________________

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales body as shown above ...]
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd73f8ec0>
args = ('x', (.identity at 0x7f3fd717cbf0>, .identity at 0x7f3fd717cbf0>))
kwargs = {}, axis = None

    [... matplotlib scale.py wrapper as shown above ...]
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
arg = 'x', expected = 'x'

    @pytest.mark.parametrize(
        "arg,expected",
        [("x", "x"), ("y", "y"), ("v", "x"), ("h", "y")],
    )
    def test_orient(self, arg, expected):

        class MockStatTrackOrient(Stat):
            def __call__(self, data, groupby, orient, scales):
                self.orient_at_call = orient
                return data

        class MockMoveTrackOrient(Move):
            def __call__(self, data, groupby, orient, scales):
                self.orient_at_call = orient
                return data

        s = MockStatTrackOrient()
        m = MockMoveTrackOrient()
>       Plot(x=[1, 2, 3], y=[1, 2, 3]).add(MockMark(), s, m,
        orient=arg).plot()

tests/_core/test_plot.py:286:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    [... _setup_scales body as shown above ...]
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________ TestLayerAddition.test_orient[y-y] ______________________

self = 
p = 
common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales body as shown above ...]
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd740cc20>
args = ('x', (.identity at 0x7f3fd7483a00>, .identity at 0x7f3fd7483a00>))
kwargs = {}, axis = None

    [... matplotlib scale.py wrapper as shown above ...]
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
arg = 'y', expected = 'y'

    @pytest.mark.parametrize(
        "arg,expected",
        [("x", "x"), ("y", "y"), ("v", "x"), ("h", "y")],
    )
    def test_orient(self, arg, expected):

        class MockStatTrackOrient(Stat):
            def __call__(self, data, groupby, orient, scales):
                self.orient_at_call = orient
                return data

        class MockMoveTrackOrient(Move):
            def __call__(self, data, groupby, orient, scales):
                self.orient_at_call = orient
                return data

        s = MockStatTrackOrient()
        m = MockMoveTrackOrient()
>       Plot(x=[1, 2, 3], y=[1, 2, 3]).add(MockMark(), s, m,
orient=arg).plot() tests/_core/test_plot.py:286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestLayerAddition.test_orient[v-x] ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd7338980> args = ('x', (.identity at 0x7f3fd6012cf0>, .identity at 0x7f3fd6012cf0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = arg = 'v', expected = 'x' @pytest.mark.parametrize( "arg,expected", [("x", "x"), ("y", "y"), ("v", "x"), ("h", "y")], ) def test_orient(self, arg, expected): class MockStatTrackOrient(Stat): def __call__(self, data, groupby, orient, scales): self.orient_at_call = orient return data class MockMoveTrackOrient(Move): def __call__(self, data, groupby, orient, scales): self.orient_at_call = orient return data s = MockStatTrackOrient() m = MockMoveTrackOrient() > Plot(x=[1, 2, 3], y=[1, 2, 3]).add(MockMark(), s, m, 
orient=arg).plot() tests/_core/test_plot.py:286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestLayerAddition.test_orient[h-y] ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f386e0> args = ('x', (.identity at 0x7f3fd732e1f0>, .identity at 0x7f3fd732e1f0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = arg = 'h', expected = 'y' @pytest.mark.parametrize( "arg,expected", [("x", "x"), ("y", "y"), ("v", "x"), ("h", "y")], ) def test_orient(self, arg, expected): class MockStatTrackOrient(Stat): def __call__(self, data, groupby, orient, scales): self.orient_at_call = orient return data class MockMoveTrackOrient(Move): def __call__(self, data, groupby, orient, scales): self.orient_at_call = orient return data s = MockStatTrackOrient() m = MockMoveTrackOrient() > Plot(x=[1, 2, 3], y=[1, 2, 3]).add(MockMark(), s, m, 
orient=arg).plot() tests/_core/test_plot.py:286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________________ TestScaling.test_inference __________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f3a270> args = ('x', (.identity at 0x7f3fd738e8d0>, .identity at 0x7f3fd738e8d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_inference(self, long_df): for col, scale_type in zip("zat", ["Continuous", "Nominal", "Temporal"]): > p = Plot(long_df, x=col, y=col).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestScaling.test_inference_from_layer_data __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd703c830> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_inference_from_layer_data(self): > p = Plot().add(MockMark(), x=["a", "b", "c"]).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:350: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if 
variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestScaling.test_inference_joins _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd7269750>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f3a7b0> args = ('y', (.identity at 0x7f3fd74212d0>, .identity at 0x7f3fd74212d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_inference_joins(self): p = ( Plot(y=pd.Series([1, 2, 3, 4])) .add(MockMark(), x=pd.Series([1, 2])) .add(MockMark(), x=pd.Series(["a", "b"], index=[2, 3])) > .plot() ^^^^^^ ) tests/_core/test_plot.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd7269750>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestScaling.test_inferred_categorical_converter ________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd72da270> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the 
following exception: self = def test_inferred_categorical_converter(self): > p = Plot(x=["b", "c", "a"]).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:365: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestScaling.test_explicit_categorical_converter ________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd7241be0> args = ('y',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_explicit_categorical_converter(self): > p = Plot(y=[2, 1, 3]).scale(y=Nominal()).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:371: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = 
None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestScaling.test_faceted_log_scale ______________________ self = p = common = , layers = [] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd73e0830> args = ('y', (.log at 0x7f3fd71e6c40>, .exp at 0x7f3fd72fa770>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_faceted_log_scale(self): > p = Plot(y=[1, 10]).facet(col=["a", "b"]).scale(y="log").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:385: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestScaling.test_paired_single_log_scale ___________________ self = p = common = , layers = [] variables = ['x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71470e0> args = ('x0', (.identity at 0x7f3fd7467950>, .identity at 0x7f3fd7467950>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_paired_single_log_scale(self): x0, x1 = [1, 2, 3], [1, 10, 100] > p = Plot().pair(x=[x0, x1]).scale(x1="log").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:393: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = <...>, p = <...>, common = <...>, layers = []
variables = ['x0', 'x1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________ TestScaling.test_paired_with_common_fallback _________________

self = <...>, p = <...>, common = <...>, layers = []
variables = ['x0', 'x1']

    def _setup_scales(
        ...
            try:
>               self._scales[var] = scale._setup(var_df[var], prop)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd6675a90>
args = ('x0', (<...forward at 0x7f3fd6f00300>, <...inverse at 0x7f3fd6f003b0>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_paired_with_common_fallback(self):

        x0, x1 = [1, 2, 3], [1, 10, 100]
>       p = Plot().pair(x=[x0, x1]).scale(x="pow", x1="log").plot()

tests/_core/test_plot.py:403:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <...>, p = <...>, common = <...>, layers = []
variables = ['x0', 'x1']

    def _setup_scales(
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________ TestScaling.test_mark_data_log_transform_is_inverted _____________

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

    def _setup_scales(
        ...
            try:
>               self._scales[var] = scale._setup(var_df[var], prop)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd65f9d30>
args = ('x', (<...log at 0x7f3fd60c54e0>, <...exp at 0x7f3fd7465170>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>
long_df = ... [100 rows x 13 columns]

    def test_mark_data_log_transform_is_inverted(self, long_df):

        col = "z"
        m = MockMark()
>       Plot(long_df, x=col).scale(x="log").add(m).plot()

tests/_core/test_plot.py:422:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

    def _setup_scales(
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________ TestScaling.test_mark_data_log_transfrom_with_stat ______________

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
            try:
>               self._scales[var] = scale._setup(var_df[var], prop)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
    ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd737d7f0>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>
long_df = ... [100 rows x 13 columns]

    def test_mark_data_log_transfrom_with_stat(self, long_df):

        class Mean(Stat):
            group_by_orient = True
            def __call__(self, data, groupby, orient, scales):
                other = {"x": "y", "y": "x"}[orient]
                return groupby.agg(data, {other: "mean"})

        col = "z"
        grouper = "a"
        m = MockMark()
        s = Mean()
>       Plot(long_df, x=grouper, y=col).scale(y="log").add(m, s).plot()

tests/_core/test_plot.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________ TestScaling.test_mark_data_from_categorical __________________

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

    def _setup_scales(
        ...
            try:
>               self._scales[var] = scale._setup(var_df[var], prop)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
    ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd71b1160>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>
long_df = ... [100 rows x 13 columns]

    def test_mark_data_from_categorical(self, long_df):

        col = "a"
        m = MockMark()
>       Plot(long_df, x=col).add(m).plot()

tests/_core/test_plot.py:455:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>, p = <...>, common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

    def _setup_scales(
        ...
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71b1a90> args = ('x', (.identity at 0x7f3fd6010670>, .identity at 0x7f3fd6010670>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
 8  0.2  c  8  8

[100 rows x 13 columns]

    def test_mark_data_from_datetime(self, long_df):
        col = "t"
        m = MockMark()
>       Plot(long_df, x=col).add(m).plot()

tests/_core/test_plot.py:465:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

[... _setup_scales source context omitted; identical to the first traceback ...]

            except Exception as err:
>               raise PlotSpecError._during("Scale setup", var) from err
E               seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________ TestScaling.test_computed_var_ticks ______________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

[... _setup_scales source context omitted; identical to the first traceback ...]

>                   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = <...InternalScale object at 0x7f3fd7339550>
args = ('x', (<...identity at 0x7f3fd732c9e0>, <...identity at 0x7f3fd732c9e0>))
kwargs = {}, axis = None

[... matplotlib wrapper source omitted; identical to the first traceback ...]

E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>
long_df =
     x         y         z  a  b  ...  s    f a_cat s_cat s_str
0   12  0.449243  6.611886  b  p  ...  2  0.2     b
...
 8  0.3  a  8  8
99  15  0.073484  1.036343  c  p  ...
 8  0.2  c  8  8

[100 rows x 13 columns]

    def test_computed_var_ticks(self, long_df):

        class Identity(Stat):
            def __call__(self, df, groupby, orient, scales):
                other = {"x": "y", "y": "x"}[orient]
                return df.assign(**{other: df[orient]})

        tick_locs = [1, 2, 5]
        scale = Continuous().tick(at=tick_locs)
>       p = Plot(long_df, "x").add(MockMark(), Identity()).scale(y=scale).plot()

tests/_core/test_plot.py:479:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[... _setup_scales source context omitted; identical to the first traceback ...]

            except Exception as err:
>               raise PlotSpecError._during("Scale setup", var) from err
E               seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________ TestScaling.test_computed_var_transform ____________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

[... _setup_scales source context omitted; identical to the first traceback ...]

>                   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>
long_df = (100 rows x 13 columns, as above)

    def test_computed_var_transform(self, long_df):

        class Identity(Stat):
            def __call__(self, df, groupby, orient, scales):
                other = {"x": "y", "y": "x"}[orient]
                return df.assign(**{other: df[orient]})

>       p = Plot(long_df, "x").add(MockMark(), Identity()).scale(y="log").plot()

tests/_core/test_plot.py:490:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
[... remainder of _setup_scales source omitted; identical to the first traceback ...]

            except Exception as err:
>               raise PlotSpecError._during("Scale setup", var) from err
E               seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________ TestScaling.test_explicit_range_with_axis_scaling _______________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'ymin', 'ymax']

[... _setup_scales source context omitted; identical to the first traceback ...]

>                   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_explicit_range_with_axis_scaling(self):
        x = [1, 2, 3]
        ymin = [10, 100, 1000]
        ymax = [20, 200, 2000]
        m = MockMark()
>       Plot(x=x, ymin=ymin, ymax=ymax).add(m).scale(y="log").plot()

tests/_core/test_plot.py:501:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
[... remainder of _setup_scales source omitted; identical to the first traceback ...]

            except Exception as err:
>               raise PlotSpecError._during("Scale setup", var) from err
E               seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________ TestScaling.test_derived_range_with_axis_scaling _______________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

[... _setup_scales source context omitted; identical to the first traceback ...]

>                   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_derived_range_with_axis_scaling(self):

        class AddOne(Stat):
            def __call__(self, df, *args):
                return df.assign(ymax=df["y"] + 1)

        x = y = [1, 10, 100]
        m = MockMark()
>       Plot(x, y).add(m, AddOne()).scale(y="log").plot()

tests/_core/test_plot.py:513:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
[... remainder of _setup_scales source omitted; identical to the first traceback ...]

            except Exception as err:
>               raise PlotSpecError._during("Scale setup", var) from err
E               seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________ TestScaling.test_facet_categories _______________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x']

[... _setup_scales source context omitted; identical to the first traceback ...]

>                   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/plot.py:1376:
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)

self = <...CatScale object at 0x7f3fd70a2cf0>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_facet_categories(self):
        m = MockMark()
>       p = Plot(x=["a", "b", "a", "c"]).facet(col=["x", "x", "y", "y"]).add(m).plot()

tests/_core/test_plot.py:519:
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[... _setup_scales source context omitted; identical to the first traceback ...]
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestScaling.test_facet_categories_unshared __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6c496a0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_facet_categories_unshared(self): m = MockMark() p = ( Plot(x=["a", "b", "a", "c"]) .facet(col=["x", "x", "y", "y"]) .share(x=False) .add(m) > .plot() ^^^^^^ ) tests/_core/test_plot.py:534: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, 
) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________ TestScaling.test_facet_categories_single_dim_shared ______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6d3e7b0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the 
following exception: self = def test_facet_categories_single_dim_shared(self): data = [ ("a", 1, 1), ("b", 1, 1), ("a", 1, 2), ("c", 1, 2), ("b", 2, 1), ("d", 2, 1), ("e", 2, 2), ("e", 2, 1), ] df = pd.DataFrame(data, columns=["x", "row", "col"]).assign(y=1) m = MockMark() p = ( Plot(df, x="x") .facet(row="row", col="col") .add(m) .share(x="row") > .plot() ^^^^^^ ) tests/_core/test_plot.py:557: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestScaling.test_pair_categories _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5c15010> args = ('y', (.identity at 0x7f3fd71e4460>, .identity at 0x7f3fd71e4460>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_pair_categories(self): data = [("a", "a"), ("b", "c")] df = pd.DataFrame(data, columns=["x1", "x2"]).assign(y=1) m = MockMark() > p = Plot(df, y="y").pair(x=["x1", "x2"]).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:574: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, 
coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestScaling.test_pair_categories_shared ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd73138c0> args = ('y', (.identity at 0x7f3fd717c3b0>, .identity at 0x7f3fd717c3b0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_pair_categories_shared(self): data = [("a", "a"), ("b", "c")] df = pd.DataFrame(data, columns=["x1", "x2"]).assign(y=1) m = MockMark() > p = Plot(df, y="y").pair(x=["x1", "x2"]).add(m).share(x=True).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:587: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot 
plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. 
# But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestScaling.test_identity_mapping_linewidth __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6677a10> args = ('x', (.identity at 0x7f3fd60c4880>, .identity at 0x7f3fd60c4880>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_identity_mapping_linewidth(self): m = MockMark() x = y = [1, 2, 3, 4, 5] lw = pd.Series([.5, .1, .1, .9, 3]) > Plot(x=x, y=y, linewidth=lw).scale(linewidth=None).add(m).plot() tests/_core/test_plot.py:600: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________ TestScaling.test_pair_single_coordinate_stat_orient ______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd703dfd0> args = ('x0', (.identity at 0x7f3fd654c5c0>, .identity at 0x7f3fd654c5c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_pair_single_coordinate_stat_orient(self, long_df): class MockStat(Stat): def __call__(self, data, groupby, orient, scales): self.orient = orient return data s = MockStat() > Plot(long_df).pair(x=["x", "y"]).add(MockMark(), s).plot() tests/_core/test_plot.py:611: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestScaling.test_inferred_nominal_passed_to_stat _______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6f3b8c0> args = ('y',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_inferred_nominal_passed_to_stat(self): class MockStat(Stat): def __call__(self, data, groupby, orient, scales): self.scales = scales return data s = MockStat() y = ["a", "a", "b", "c"] > Plot(y=y).add(MockMark(), s).plot() tests/_core/test_plot.py:623: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: 
PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestScaling.test_identity_mapping_color_tuples ________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd70a2120> args = ('x', (.identity at 0x7f3fd6fd94e0>, .identity at 0x7f3fd6fd94e0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_identity_mapping_color_tuples(self): m = MockMark() x = y = [1, 2, 3] c = [(1, 0, 0), (0, 1, 0), (1, 0, 0)] > Plot(x=x, y=y, color=c).scale(color=None).add(m).plot() tests/_core/test_plot.py:644: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestScaling.test_nominal_x_axis_tweaks ____________________ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd655d400> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the 
following exception: self = def test_nominal_x_axis_tweaks(self): p = Plot(x=["a", "b", "c"], y=[1, 2, 3]) > ax1 = p.plot()._figure.axes[0] ^^^^^^^^ tests/_core/test_plot.py:661: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________ TestScaling.test_nominal_y_axis_tweaks ____________________

self =
p =
common =
layers = []
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd5dd4d70>
args = ('x', (.identity at 0x7f3fd6bb3060>, .identity at 0x7f3fd6bb3060>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_nominal_y_axis_tweaks(self):
        p = Plot(x=[1, 2, 3], y=["a", "b", "c"])
>       ax1 = p.plot()._figure.axes[0]
        ^^^^^^^^

tests/_core/test_plot.py:672:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = []
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________ TestPlotting.test_no_orient_variance _____________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd65f9940>
args = ('x', (.identity at 0x7f3fd6bd7ed0>, .identity at 0x7f3fd6bd7ed0>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_no_orient_variance(self):
        x, y = [0, 0], [1, 2]
        m = MockMark()
>       Plot(x, y).add(m).plot()

tests/_core/test_plot.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________ TestPlotting.test_single_split_single_layer __________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6c4bcb0>
args = ('x', (.identity at 0x7f3fd732e610>, .identity at 0x7f3fd732e610>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df =      x         y         z  a  b  ...  s    f a_cat s_cat s_str
          0   12  0.449243  6.611886  b  p  ...  2  0.2     b   ...
          ..                              ...  8  0.3     a     8     8
          99  15  0.073484  1.036343  c  p  ...  8  0.2     c     8     8

          [100 rows x 13 columns]

    def test_single_split_single_layer(self, long_df):
        m = MockMark()
>       p = Plot(long_df, x="f", y="z").add(m).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:708:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
__________________ TestPlotting.test_single_split_multi_layer __________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': TestPlotting.test_single_split_multi_layer.<locals>.NoGroupingMark(artist_kws={}), ...}]
variables = ['color', 'linewidth', 'color', 'pattern']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
                ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd73138c0>
args = ('color',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df =      x         y         z  a  b  ...  s    f a_cat s_cat s_str
          0   12  0.449243  6.611886  b  p  ...  2  0.2     b   ...
          ..                              ...  8  0.3     a     8     8
          99  15  0.073484  1.036343  c  p  ...  8  0.2     c     8     8

          [100 rows x 13 columns]

    def test_single_split_multi_layer(self, long_df):
        vs = [{"color": "a", "linewidth": "z"}, {"color": "b", "pattern": "c"}]

        class NoGroupingMark(MockMark):
            _grouping_props = []

        ms = [NoGroupingMark(), NoGroupingMark()]

>       Plot(long_df).add(ms[0], **vs[0]).add(ms[1], **vs[1]).plot()

tests/_core/test_plot.py:724:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:953: in _plot
    plotter._setup_scales(self, common, layers)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': TestPlotting.test_single_split_multi_layer.<locals>.NoGroupingMark(artist_kws={}), ...}]
variables = ['color', 'linewidth', 'color', 'pattern']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `color` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________ TestPlotting.test_one_grouping_variable[color] ________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd737e510>
args = ('x', (.identity at 0x7f3fd64f1430>, .identity at 0x7f3fd64f1430>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df =      x         y         z  a  b  ...  s    f a_cat s_cat s_str
          0   12  0.449243  6.611886  b  p  ...  2  0.2     b   ...
          ..                              ...  8  0.3     a     8     8
          99  15  0.073484  1.036343  c  p  ...  8  0.2     c     8     8

          [100 rows x 13 columns]
split_var = 'color'

    @pytest.mark.parametrize(
        "split_var", [
            "color",  # explicitly declared on the Mark
            "group",  # implicitly used for all Mark classes
        ])
    def test_one_grouping_variable(self, long_df, split_var):
        split_col = "a"
        data_vars = {"x": "f", "y": "z", split_var: split_col}
        m = MockMark()
>       p = Plot(long_df, **data_vars).add(m).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:775:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestPlotting.test_one_grouping_variable[group] ________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c492b0> args = ('x', (.identity at 0x7f3fd72f8880>, .identity at 0x7f3fd72f8880>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] split_var = 'group' @pytest.mark.parametrize( "split_var", [ "color", # explicitly declared on the Mark "group", # implicitly used for all Mark classes ]) def test_one_grouping_variable(self, long_df, split_var): split_col = "a" data_vars = {"x": "f", "y": "z", split_var: split_col} m = MockMark() > p = Plot(long_df, **data_vars).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:775: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestPlotting.test_two_grouping_variables ___________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd65f81a0> args = ('y', (.identity at 0x7f3fd5bdddd0>, .identity at 0x7f3fd5bdddd0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_two_grouping_variables(self, long_df): split_vars = ["color", "group"] split_cols = ["a", "b"] data_vars = {"y": "z", **{var: col for var, col in zip(split_vars, split_cols)}} m = MockMark() > p = Plot(long_df, **data_vars).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:791: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_specified_width _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd7322660> args = ('x', (.identity at 0x7f3fd6568510>, .identity at 0x7f3fd6568510>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_specified_width(self, long_df): m = MockMark() > Plot(long_df, x="x", y="y").add(m, width="z").plot() tests/_core/test_plot.py:805: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestPlotting.test_facets_no_subgroups _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6bf97f0> args = ('x', (.identity at 0x7f3fd6010b40>, .identity at 0x7f3fd6010b40>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_facets_no_subgroups(self, long_df): split_var = "col" split_col = "b" data_vars = {"x": "f", "y": "z"} m = MockMark() > p = Plot(long_df, **data_vars).facet(**{split_var: split_col}).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:815: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestPlotting.test_facets_one_subgroup _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ccdd30> args = ('x', (.identity at 0x7f3fd6b49f30>, .identity at 0x7f3fd6b49f30>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_facets_one_subgroup(self, long_df): facet_var, facet_col = fx = "col", "a" group_var, group_col = gx = "group", "b" split_vars, split_cols = zip(*[fx, gx]) data_vars = {"x": "f", "y": "z", group_var: group_col} m = MockMark() p = ( Plot(long_df, **data_vars) .facet(**{facet_var: facet_col}) .add(m) > .plot() ^^^^^^ ) tests/_core/test_plot.py:835: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________ TestPlotting.test_layer_specific_facet_disabling _______________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ade270> args = ('x', (.identity at 0x7f3fd5be8300>, .identity at 0x7f3fd5be8300>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_layer_specific_facet_disabling(self, long_df): axis_vars = {"x": "y", "y": "z"} row_var = "a" m = MockMark() > p = Plot(long_df, **axis_vars).facet(row=row_var).add(m, row=None).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:854: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_paired_variables ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f13380> args = ('x0', (.identity at 0x7f3fd600f690>, .identity at 0x7f3fd600f690>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, 
*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_paired_variables(self, long_df): x = ["x", "y"] y = ["f", "z"] m = MockMark() > Plot(long_df).pair(x, y).add(m).plot() tests/_core/test_plot.py:869: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestPlotting.test_paired_one_dimension ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71b3770> args = ('x0', (.identity at 0x7f3fd6b4a350>, .identity at 0x7f3fd6b4a350>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, 
*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_paired_one_dimension(self, long_df): x = ["y", "z"] m = MockMark() > Plot(long_df).pair(x).add(m).plot() tests/_core/test_plot.py:882: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestPlotting.test_paired_variables_one_subset _________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f3acf0> args = ('x0', (.identity at 0x7f3fd5be8300>, .identity at 0x7f3fd5be8300>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, 
*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12.0 0.449243 6.611886 b p ... 2 0.2 ...8 0.3 a 8 8 99 15.0 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_paired_variables_one_subset(self, long_df): x = ["x", "y"] y = ["f", "z"] group = "a" long_df["x"] = long_df["x"].astype(float) # simplify vector comparison m = MockMark() > Plot(long_df, group=group).pair(x, y).add(m).plot() tests/_core/test_plot.py:896: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x0', 'x1', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________ TestPlotting.test_paired_and_faceted _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f041a0> args = ('y', (.identity at 0x7f3fd6010670>, .identity at 0x7f3fd6010670>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8 [100 rows x 13 columns] def test_paired_and_faceted(self, long_df): x = ["y", "z"] y = "f" row = "c" m = MockMark() > Plot(long_df, y=y).facet(row=row).pair(x).add(m).plot() tests/_core/test_plot.py:913: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            [... _setup_scales body repeated verbatim; see the first listing above ...]
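The repeated `FuncScale.__init__() takes 3 positional arguments but 4 were given` errors above all share one mechanism: seaborn's `InternalScale(name, (forward, inverse))` call passes the variable name positionally, and the wrapper shown at `matplotlib/scale.py:178` injects `axis` as one more positional argument. A minimal self-contained model of that arithmetic (`axis_compat` and `FuncScaleLike` are hypothetical stand-ins, not the real matplotlib objects):

```python
from functools import wraps

def axis_compat(init_func):
    # Models the wrapper visible at matplotlib/scale.py:178 in the log:
    # when the first argument is not an Axis, pop 'axis' from kwargs and
    # re-pass it positionally ahead of the remaining arguments.
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        axis = kwargs.pop("axis", None)
        return init_func(self, axis, *args, **kwargs)
    return wrapper

class FuncScaleLike:
    # Hypothetical stand-in: three positional parameters counting self.
    @axis_compat
    def __init__(self, axis, functions):
        self.functions = functions

# Passing (name, (forward, inverse)) as seaborn does becomes four
# positionals once axis is injected, reproducing the TypeError in the log.
try:
    FuncScaleLike("y", (abs, abs))
except TypeError as err:
    print(err)  # takes 3 positional arguments but 4 were given
```

Calling the stand-in without the extra name (`FuncScaleLike((abs, abs))`) goes through cleanly, with `axis` arriving as `None`.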
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_theme_validation ______________________ self = def test_theme_validation(self): p = Plot() # You'd think matplotlib would raise a TypeError here, but it doesn't with pytest.raises(ValueError, match="Key axes.linewidth:"): p.theme({"axes.linewidth": "thick"}) with pytest.raises(KeyError, match="not.a.key is not a valid rc"): > p.theme({"not.a.key": True}) tests/_core/test_plot.py:948: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:884: in theme rc = mpl.RcParams(config) ^^^^^^^^^^^^^^^^^^^^ /usr/lib64/python3.14/site-packages/matplotlib/__init__.py:700: in __init__ self.update(*args, **kwargs) :971: in update ??? /usr/lib64/python3.14/site-packages/matplotlib/__init__.py:777: in __setitem__ valid_key = _api.getitem_checked( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mapping = {'_internal.classic_mode': , 'agg.path.chunksize': , 'animation.codec': .validate_str at 0x7f3fe08509e0>, ...} _error_cls = , kwargs = {'rcParam': 'not.a.key'} k = 'rcParam', v = 'not.a.key' def getitem_checked(mapping, /, _error_cls=ValueError, **kwargs): """ *kwargs* must consist of a single *key, value* pair. If *key* is in *mapping*, return ``mapping[value]``; else, raise an appropriate ValueError. Parameters ---------- _error_cls : Class of error to raise. 
Examples -------- >>> _api.getitem_checked({"foo": "bar"}, arg=arg) """ if len(kwargs) != 1: raise ValueError("getitem_checked takes a single keyword argument") (k, v), = kwargs.items() try: return mapping[v] except KeyError: > raise _error_cls(list_suggestion_error_msg(k, v, mapping.keys())) from None E KeyError: "'not.a.key' is not a valid value for rcParam. Did you mean: 'font.family'?" /usr/lib64/python3.14/site-packages/matplotlib/_api/__init__.py:286: KeyError During handling of the above exception, another exception occurred: self = def test_theme_validation(self): p = Plot() # You'd think matplotlib would raise a TypeError here, but it doesn't with pytest.raises(ValueError, match="Key axes.linewidth:"): p.theme({"axes.linewidth": "thick"}) > with pytest.raises(KeyError, match="not.a.key is not a valid rc"): ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E AssertionError: Regex pattern did not match. E Expected regex: 'not.a.key is not a valid rc' E Actual message: '"\'not.a.key\' is not a valid value for rcParam. Did you mean: \'font.family\'?"' tests/_core/test_plot.py:947: AssertionError ____________________________ TestPlotting.test_stat ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
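The `test_theme_validation` failure above is different in kind: the plot spec is fine, but matplotlib's KeyError wording changed to `'not.a.key' is not a valid value for rcParam. Did you mean: 'font.family'?`, so the test's `match` string no longer applies. A sketch of a version-tolerant `match` pattern (the older wording below is an assumption inferred from the test's expected regex):

```python
import re

# Assumed older wording (inferred from the test's expected pattern) vs. the
# wording seen in this log from newer matplotlib.
old_msg = "'not.a.key' is not a valid rc parameter"
new_msg = "\"'not.a.key' is not a valid value for rcParam. Did you mean: 'font.family'?\""

# pytest.raises(..., match=pattern) uses re.search, so one pattern that
# matches either phrasing keeps the test passing on both versions.
pattern = r"not\.a\.key.*is not a valid"
assert re.search(pattern, old_msg)
assert re.search(pattern, new_msg)
```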
            [... _setup_scales body repeated verbatim; see the first listing above ...]
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6642ba0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns]

    def test_stat(self, long_df):
        orig_df = long_df.copy(deep=True)
        m = MockMark()
>       Plot(long_df, x="a", y="z").add(m, Agg()).plot()

tests/_core/test_plot.py:955:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales signature and body repeated verbatim; see the first listing above ...]
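The categorical failures (`LinearScale.__init__() takes 2 positional arguments but 3 were given`, raised from `CatScale(data.name)`) follow the same pattern as the `FuncScale` errors, with one fewer argument: the column name is passed positionally and the wrapper prepends `axis`. A sketch with a hypothetical stand-in:

```python
from functools import wraps

def axis_compat(init_func):
    # Same wrapper shape as in the FuncScale failures: 'axis' is injected
    # positionally whenever the first argument is not an Axis object.
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        axis = kwargs.pop("axis", None)
        return init_func(self, axis, *args, **kwargs)
    return wrapper

class LinearScaleLike:
    # Hypothetical stand-in: two positional parameters counting self.
    @axis_compat
    def __init__(self, axis):
        self.axis = axis

# CatScale(data.name) passes the column name ('x') positionally; with the
# injected axis that is three positionals, reproducing the TypeError.
try:
    LinearScaleLike("x")
except TypeError as err:
    print(err)  # takes 2 positional arguments but 3 were given
```

Constructing the stand-in without the name (`LinearScaleLike()`) succeeds, with `axis` arriving as `None`.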
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________________ TestPlotting.test_move ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            [... _setup_scales body repeated verbatim; see the first listing above ...]
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd72ca510> args = ('x', (.identity at 0x7f3fd5bde8d0>, .identity at 0x7f3fd5bde8d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_move(self, long_df): orig_df = long_df.copy(deep=True) m = MockMark() > Plot(long_df, x="z", y="z").add(m, Shift(x=1)).plot() tests/_core/test_plot.py:967: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestPlotting.test_stat_and_move ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            [... _setup_scales body repeated verbatim; see the first listing above ...]
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6541e80> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns]

    def test_stat_and_move(self, long_df):
        m = MockMark()
>       Plot(long_df, x="a", y="z").add(m, Agg(), Shift(y=1)).plot()

tests/_core/test_plot.py:976:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']
    [... _setup_scales signature and body repeated verbatim; see the first listing above ...]
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestPlotting.test_stat_log_scale _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            [... _setup_scales body repeated verbatim; see the first listing above ...]
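Each scale failure in this log surfaces as a `PlotSpecError` because `_setup_scales` wraps the underlying error with `raise ... from err`, which is what produces the "The above exception was the direct cause of the following exception" framing in every traceback. The chaining pattern in miniature (hypothetical names, not the real seaborn classes):

```python
class PlotSpecErrorLike(Exception):
    # Hypothetical stand-in mirroring the message format produced by
    # seaborn._core.exceptions.PlotSpecError._during in the log.
    @classmethod
    def _during(cls, step, var):
        return cls(f"{step} failed for the `{var}` variable. "
                   "See the traceback above for more information.")

def setup_scale(var):
    try:
        # Stand-in for scale._setup(...) raising the underlying TypeError.
        raise TypeError("takes 2 positional arguments but 3 were given")
    except Exception as err:
        # 'from err' stores the original error as __cause__, giving the
        # "direct cause" chain seen above.
        raise PlotSpecErrorLike._during("Scale setup", var) from err

try:
    setup_scale("x")
except PlotSpecErrorLike as e:
    assert "Scale setup failed for the `x` variable" in str(e)
    assert isinstance(e.__cause__, TypeError)
```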
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6cb57f0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_stat_log_scale(self, long_df): orig_df = long_df.copy(deep=True) m = MockMark() > Plot(long_df, x="a", y="z").add(m, Agg()).scale(y="log").plot() tests/_core/test_plot.py:986: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestPlotting.test_move_log_scale _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5bed160> args = ('x', (.log at 0x7f3fd6d047d0>, .exp at 0x7f3fd5c10670>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_move_log_scale(self, long_df): m = MockMark() Plot( long_df, x="z", y="z" > ).scale(x="log").add(m, Shift(x=-1)).plot() ^^^^^^ tests/_core/test_plot.py:1000: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestPlotting.test_multi_move _________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f07770> args = ('x', (.identity at 0x7f3fd6ed0250>, .identity at 0x7f3fd6ed0250>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_multi_move(self, long_df): m = MockMark() move_stack = [Shift(1), Shift(2)] > Plot(long_df, x="x", y="y").add(m, *move_stack).plot() tests/_core/test_plot.py:1007: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestPlotting.test_multi_move_with_pairing ___________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f13230> args = ('x', (.identity at 0x7f3fd71e5dd0>, .identity at 0x7f3fd71e5dd0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_multi_move_with_pairing(self, long_df): m = MockMark() move_stack = [Shift(1), Shift(2)] > Plot(long_df, x="x").pair(y=["y", "z"]).add(m, *move_stack).plot() tests/_core/test_plot.py:1013: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_move_with_range _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'ymin', 'ymax'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6addd30> args = ('x', (.identity at 0x7f3fd5e885c0>, .identity at 0x7f3fd5e885c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_move_with_range(self, long_df): x = [0, 0, 1, 1, 2, 2] group = [0, 1, 0, 1, 0, 1] ymin = np.arange(6) ymax = np.arange(6) * 2 m = MockMark() > Plot(x=x, group=group, ymin=ymin, ymax=ymax).add(m, Dodge()).plot() tests/_core/test_plot.py:1025: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'ymin', 'ymax'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________________ TestPlotting.test_on_axes ___________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f0c440> args = ('x', (.identity at 0x7f3fd65352d0>, .identity at 0x7f3fd65352d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_on_axes(self): ax = mpl.figure.Figure().subplots() m = MockMark() > p = Plot([1], [2]).on(ax).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1124: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_on_figure[True] _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5fdacf0> args = ('x', (.identity at 0x7f3fd6b71a60>, .identity at 0x7f3fd6b71a60>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = facet = True @pytest.mark.parametrize("facet", [True, False]) def test_on_figure(self, facet): f = mpl.figure.Figure() m = MockMark() p = Plot([1, 2], [3, 4]).on(f).add(m) if facet: p = p.facet(["a", "b"]) > p = p.plot() ^^^^^^^^ tests/_core/test_plot.py:1136: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPlotting.test_on_figure[False] ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd60d6660> args = ('x', (.identity at 0x7f3fd6170930>, .identity at 0x7f3fd6170930>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = facet = False @pytest.mark.parametrize("facet", [True, False]) def test_on_figure(self, facet): f = mpl.figure.Figure() m = MockMark() p = Plot([1, 2], [3, 4]).on(f).add(m) if facet: p = p.facet(["a", "b"]) > p = p.plot() ^^^^^^^^ tests/_core/test_plot.py:1136: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________ TestPlotting.test_on_subfigure[True] _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd640e120> args = ('x', (.identity at 0x7f3fd618b740>, .identity at 0x7f3fd618b740>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = facet = True @pytest.mark.parametrize("facet", [True, False]) def test_on_subfigure(self, facet): sf1, sf2 = mpl.figure.Figure().subfigures(2) sf1.subplots() m = MockMark() p = Plot([1, 2], [3, 4]).on(sf2).add(m) if facet: p = p.facet(["a", "b"]) > p = p.plot() ^^^^^^^^ tests/_core/test_plot.py:1149: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot 
plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly.
# But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestPlotting.test_on_subfigure[False] _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6262a50> args = ('x', (.identity at 0x7f3fd620cd50>, .identity at 0x7f3fd620cd50>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = facet = False @pytest.mark.parametrize("facet", [True, False]) def test_on_subfigure(self, facet): sf1, sf2 = mpl.figure.Figure().subfigures(2) sf1.subplots() m = MockMark() p = Plot([1, 2], [3, 4]).on(sf2).add(m) if facet: p = p.facet(["a", "b"]) > p = p.plot() ^^^^^^^^ tests/_core/test_plot.py:1149: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot 
plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly.
# But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestPlotting.test_axis_labels_from_constructor ________________ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
                ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .CatScale object at 0x7f3fd6322510>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df =  x y z a b ... s f a_cat s_cat s_str
           0 12 0.449243 6.611886 b p ... 2 0.2 b ...
           ... 8 0.3 a 8 8
           99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8

[100 rows x 13 columns]

    def test_axis_labels_from_constructor(self, long_df):

>       ax, = Plot(long_df, x="a", y="b").plot()._figure.axes
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1198:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = , layers = []
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________ TestPlotting.test_axis_labels_from_layer ___________________

self = p = common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
                ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .CatScale object at 0x7f3fd5bef4d0>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df =  x y z a b ... s f a_cat s_cat s_str
           0 12 0.449243 6.611886 b p ... 2 0.2 b ...
           ... 8 0.3 a 8 8
           99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8

[100 rows x 13 columns]

    def test_axis_labels_from_layer(self, long_df):

        m = MockMark()
>       ax, = Plot(long_df).add(m, x="a", y="b").plot()._figure.axes
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1210:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________ TestPlotting.test_axis_labels_are_first_name _________________

self = p = common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd5b33d50>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
                ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .CatScale object at 0x7f3fd6cb7770>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df =  x y z a b ... s f a_cat s_cat s_str
           0 12 0.449243 6.611886 b p ... 2 0.2 b ...
           ... 8 0.3 a 8 8
           99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8

[100 rows x 13 columns]

    def test_axis_labels_are_first_name(self, long_df):

        m = MockMark()
        p = (
            Plot(long_df, x=long_df["z"].to_list(), y="b")
            .add(m, x="a")
            .add(m, x="x", y="y")
        )
>       ax, = p.plot()._figure.axes
              ^^^^^^^^

tests/_core/test_plot.py:1227:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd5b33d50>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________________ TestPlotting.test_limits ___________________________

self = p = common = , layers = []
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .InternalScale object at 0x7f3fd702d400>
args = ('x', (.identity at 0x7f3fd7684300>, .identity at 0x7f3fd7684300>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df =  x y z a b ... s f a_cat s_cat s_str
           0 12 0.449243 6.611886 b p ... 2 0.2 b ...
           ... 8 0.3 a 8 8
           99 15 0.073484 1.036343 c p ... 8 0.2 c 8 8

[100 rows x 13 columns]

    def test_limits(self, long_df):

        limit = (-2, 24)
>       p = Plot(long_df, x="x", y="y").limit(x=limit).plot()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1234:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
           ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = p = common = , layers = []
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestPlotting.test_labels_axis _________________________

self = p = common = , layers = []
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd737e7b0> args = ('x', (.identity at 0x7f3fd732d4e0>, .identity at 0x7f3fd732d4e0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_labels_axis(self, long_df): label = "Y axis" > p = Plot(long_df, x="x", y="y").label(y=label).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestPlotting.test_labels_legend ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd737d2b0> args = ('x', (.identity at 0x7f3fd64f14e0>, .identity at 0x7f3fd64f14e0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_labels_legend(self, long_df): m = MockMark() label = "A" > p = Plot(long_df, x="x", y="y", color="a").add(m).label(color=label).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1265: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestExceptions.test_scale_setup ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6c2c2f0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_scale_setup(self): x = y = color = ["a", "b"] bad_palette = "not_a_palette" p = Plot(x, y, color=color).add(MockMark()).scale(color=bad_palette) msg = "Scale setup failed for the `color` variable." 
with pytest.raises(PlotSpecError, match=msg) as err: > p.plot() tests/_core/test_plot.py:1311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError During handling of the above exception, another exception occurred: self = def test_scale_setup(self): x = y = color = ["a", "b"] bad_palette = "not_a_palette" p = Plot(x, y, color=color).add(MockMark()).scale(color=bad_palette) msg = "Scale setup failed for the `color` variable." > with pytest.raises(PlotSpecError, match=msg) as err: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E AssertionError: Regex pattern did not match. E Expected regex: 'Scale setup failed for the `color` variable.' E Actual message: 'Scale setup failed for the `x` variable. See the traceback above for more information.' 
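[Editor's note] The recurring `TypeError: ... takes N positional arguments but N+1 were given` in these failures comes from a compatibility wrapper that pops `axis` from kwargs and re-inserts it positionally, shifting every caller-supplied positional argument right by one. A minimal self-contained sketch of that failure mode (the class and decorator names here are hypothetical stand-ins, not matplotlib's actual code):

```python
from functools import wraps

class FakeScale:
    # Hypothetical scale whose __init__ takes (axis, functions),
    # standing in for a matplotlib scale constructor.
    def __init__(self, axis, functions):
        self.axis = axis
        self.functions = functions

def axis_compat(init_func):
    # Mimics the shim seen in the traceback: pop `axis` from kwargs
    # and pass it positionally ahead of the caller's own positionals.
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        axis = kwargs.pop('axis', None)
        return init_func(self, axis, *args, **kwargs)
    return wrapper

class ShimmedScale(FakeScale):
    __init__ = axis_compat(FakeScale.__init__)

# A caller that already passes everything positionally -- as seaborn does
# with InternalScale(name, (forward, inverse)) -- now sends one arg too many:
try:
    ShimmedScale("x", (str, int))
except TypeError as exc:
    print(exc)  # e.g. "... takes 3 positional arguments but 4 were given"
```

The arity in the message matches the log: `__init__` accepts 3 positionals (including `self`) but receives 4 once the shim prepends `axis`.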
tests/_core/test_plot.py:1310: AssertionError ____________________ TestExceptions.test_coordinate_scaling ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly.
# But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c2fb60> args = ('x', (.identity at 0x7f3fd6171640>, .identity at 0x7f3fd6171640>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_coordinate_scaling(self): x = ["a", "b"] y = [1, 2] p = Plot(x, y).add(MockMark()).scale(x=Temporal()) msg = "Scaling operation failed for the `x` variable." 
with pytest.raises(PlotSpecError, match=msg) as err: > p.plot() tests/_core/test_plot.py:1323: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError During handling of the above exception, another exception occurred: self = def test_coordinate_scaling(self): x = ["a", "b"] y = [1, 2] p = Plot(x, y).add(MockMark()).scale(x=Temporal()) msg = "Scaling operation failed for the `x` variable." > with pytest.raises(PlotSpecError, match=msg) as err: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E AssertionError: Regex pattern did not match. E Expected regex: 'Scaling operation failed for the `x` variable.' E Actual message: 'Scale setup failed for the `x` variable. See the traceback above for more information.' 
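[Editor's note] The coordinate-variable regex quoted in the `_setup_scales` listings uses two nested named groups, `coord` and `axis` (their names are confirmed by the `m["coord"]` / `m["axis"]` lookups just below the match). A quick sketch of how it classifies variable names:

```python
import re

# Pattern reconstructed from the listing: `coord` captures e.g. "x0",
# the nested `axis` group captures just the bare "x" or "y".
pattern = r"^(?P<coord>(?P<axis>x|y)\d*).*"

m = re.match(pattern, "x0")        # paired-coordinate variable
print(m["coord"], m["axis"])       # x0 x

m = re.match(pattern, "xmax")      # derivative variable such as xmax
print(m["coord"], m["axis"])       # x x

print(re.match(pattern, "color"))  # None: not a coordinate variable
```

This is why `prop_key` collapses `xmax` to the `x` property while `scale_key` keeps `x0` distinct for paired plots.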
tests/_core/test_plot.py:1322: AssertionError _____________________ TestExceptions.test_semantic_scaling _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dot(artist_kw...>, alpha=<1>, fill=, edgecolor=, edgealpha=, edgewidth=<0.5>, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5ef7770> args = ('x', (.identity at 0x7f3fd6536140>, .identity at 0x7f3fd6536140>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_semantic_scaling(self): class ErrorRaising(Continuous): def _setup(self, data, prop, axis=None): def f(x): raise ValueError("This is a test") new = super()._setup(data, prop, axis) new._pipeline = [f] return new x = y = color = [1, 2] p = Plot(x, y, color=color).add(Dot()).scale(color=ErrorRaising()) msg = "Scaling operation failed for the `color` variable." with pytest.raises(PlotSpecError, match=msg) as err: > p.plot() tests/_core/test_plot.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dot(artist_kw...>, alpha=<1>, fill=, edgecolor=, edgealpha=, edgewidth=<0.5>, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError During handling of the above exception, another exception occurred: self = def test_semantic_scaling(self): class ErrorRaising(Continuous): def _setup(self, data, prop, axis=None): def f(x): raise ValueError("This is a test") new = super()._setup(data, prop, axis) new._pipeline = [f] return new x = y = color = [1, 2] p = Plot(x, y, color=color).add(Dot()).scale(color=ErrorRaising()) msg = "Scaling operation failed for the `color` variable." > with pytest.raises(PlotSpecError, match=msg) as err: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E AssertionError: Regex pattern did not match. E Expected regex: 'Scaling operation failed for the `color` variable.' E Actual message: 'Scale setup failed for the `x` variable. See the traceback above for more information.' tests/_core/test_plot.py:1343: AssertionError ___________________ TestFacetInterface.test_unshared_spacing ___________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd534fe00> args = ('x', (.identity at 0x7f3fd5322820>, .identity at 0x7f3fd5322820>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_unshared_spacing(self): x = [1, 2, 10, 20] y = [1, 2, 3, 4] col = [1, 1, 2, 2] m = MockMark() > Plot(x, y).facet(col).add(m).share(x=False).plot() tests/_core/test_plot.py:1495: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestPairInterface.test_all_numeric[list] ___________________ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4fd97f0> args = ('x0', (.identity at 0x7f3fd4d14eb0>, .identity at 0x7f3fd4d14eb0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] vector_type = @pytest.mark.parametrize("vector_type", [list, pd.Index]) def test_all_numeric(self, long_df, vector_type): x, y = ["x", "y", "z"], ["s", "f"] > p = Plot(long_df).pair(vector_type(x), vector_type(y)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1539: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestPairInterface.test_all_numeric[Index] ___________________ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4b48ec0> args = ('x0', (.identity at 0x7f3fd4bb7740>, .identity at 0x7f3fd4bb7740>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] vector_type = @pytest.mark.parametrize("vector_type", [list, pd.Index]) def test_all_numeric(self, long_df, vector_type): x, y = ["x", "y", "z"], ["s", "f"] > p = Plot(long_df).pair(vector_type(x), vector_type(y)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1539: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestPairInterface.test_single_dimension[x] __________________ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5e35400> args = ('x0', (.identity at 0x7f3fd5321010>, .identity at 0x7f3fd5321010>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] dim = 'x' @pytest.mark.parametrize("dim", ["x", "y"]) def test_single_dimension(self, long_df, dim): variables = {"x": None, "y": None} variables[dim] = ["x", "y", "z"] > p = Plot(long_df).pair(**variables).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1554: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x0', 'x1', 'x2'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestPairInterface.test_single_dimension[y] __________________ self = p = common = , layers = [] variables = ['y0', 'y1', 'y2'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c4b620> args = ('y0', (.identity at 0x7f3fd4bb7690>, .identity at 0x7f3fd4bb7690>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8

[100 rows x 13 columns]
dim = 'y'

    @pytest.mark.parametrize("dim", ["x", "y"])
    def test_single_dimension(self, long_df, dim):
        variables = {"x": None, "y": None}
        variables[dim] = ["x", "y", "z"]
>       p = Plot(long_df).pair(**variables).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1554:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
p =
common = , layers = []
variables = ['y0', 'y1', 'y2']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y0` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestPairInterface.test_non_cross _______________________

self =
p =
common = , layers = []
variables = ['x0', 'x1', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
        [ ...remainder of the repeated _setup_scales listing, down to the failing call... ]
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .InternalScale object at 0x7f3fd7313a10>
args = ('x0', (.identity at 0x7f3fd65bf480>, .identity at 0x7f3fd65bf480>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_non_cross(self, long_df):
        x = ["x", "y"]
        y = ["f", "z"]
>       p = Plot(long_df).pair(x, y, cross=False).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1563:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
p =
common = , layers = []
variables = ['x0', 'x1', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________ TestPairInterface.test_list_of_vectors ____________________

self =
p =
common = , layers = []
variables = ['y', 'x0', 'x1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
        [ ...remainder of the repeated _setup_scales listing, down to the failing call... ]
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .InternalScale object at 0x7f3fd5f07380>
args = ('y', (.identity at 0x7f3fd7173690>, .identity at 0x7f3fd7173690>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_list_of_vectors(self, long_df):
        x_vars = ["x", "z"]
>       p = Plot(long_df, y="y").pair(x=[long_df[x] for x in x_vars]).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1579:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
p =
common = , layers = []
variables = ['y', 'x0', 'x1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________ TestPairInterface.test_with_facets ______________________

self =
p =
common = , layers = []
variables = ['x', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
        [ ...remainder of the repeated _setup_scales listing, down to the failing call... ]
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .InternalScale object at 0x7f3fd5de6660>
args = ('x', (.identity at 0x7f3fd5f7c880>, .identity at 0x7f3fd5f7c880>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_with_facets(self, long_df):
        x = "x"
        y = ["y", "z"]
        col = "a"
>       p = Plot(long_df, x=x).facet(col).pair(y=y).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_core/test_plot.py:1595:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
p =
common = , layers = []
variables = ['x', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________ TestPairInterface.test_axis_sharing ______________________

self =
p =
common = , layers = []
variables = ['x0', 'x1', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
        [ ...remainder of the repeated _setup_scales listing, down to the failing call... ]
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
                ^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .CatScale object at 0x7f3fd5c43770>
args = ('x0',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
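This failure is the same arity mismatch in a second form: here the traceback goes through `CatScale(data.name)` at scales.py:288, where seaborn's `CatScale` inherits its `__init__` from matplotlib's `LinearScale`, whose signature per the error accepts only two positional arguments. A matplotlib-free sketch of this variant (again, `Axis` and `axis_compat` are stand-ins for the real matplotlib machinery):

```python
from functools import wraps


class Axis:
    """Stand-in for mpl.axis.Axis, only used for the isinstance check."""


def axis_compat(init_func):
    # Same wrapper shape as in the traceback: `axis` is injected
    # positionally whenever the caller did not pass an Axis first.
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], Axis):
            return init_func(self, *args, **kwargs)
        axis = kwargs.pop('axis', None)
        return init_func(self, axis, *args, **kwargs)
    return wrapper


class LinearScale:
    # Per the error message, the wrapped __init__ takes only
    # 2 positional arguments (self, axis) -- no scale name.
    @axis_compat
    def __init__(self, axis):
        self.axis = axis


class CatScale(LinearScale):
    """seaborn-style subclass that does not override __init__."""


try:
    # seaborn calls CatScale(data.name), passing the variable name
    # positionally; the wrapper shifts it into a third argument.
    CatScale('x0')
except TypeError as e:
    # e.g. "LinearScale.__init__() takes 2 positional arguments but 3 were given"
    print(e)
```

The positional `'x0'` lands after the injected `axis`, so a signature that expects only `(self, axis)` receives three arguments and raises.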
8 0.2 c 8 8

[100 rows x 13 columns]

    def test_axis_sharing(self, long_df):
        p = Plot(long_df).pair(x=["a", "b"], y=["y", "z"])
        shape = 2, 2
>       p1 = p.plot()
        ^^^^^^^^

tests/_core/test_plot.py:1635:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
    ^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
p =
common = , layers = []
variables = ['x0', 'x1', 'y0', 'y1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x0` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________ TestPairInterface.test_axis_sharing_with_facets ________________

self =
p =
common = , layers = []
variables = ['y', 'x0', 'x1']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        [ _setup_scales body repeated verbatim; see the first listing above ]
        [ ...remainder of the repeated _setup_scales listing, down to the failing call... ]
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = .InternalScale object at 0x7f3fd5f38980>
args = ('y', (.identity at 0x7f3fd6ab4a90>, .identity at 0x7f3fd6ab4a90>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
8 0.2 c 8 8 [100 rows x 13 columns] def test_axis_sharing_with_facets(self, long_df): > p = Plot(long_df, y="y").pair(x=["a", "b"]).facet(row="c").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1658: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPairInterface.test_x_wrapping _______________________ self = p = common = , layers = [] variables = ['y', 'x0', 'x1', 'x2', 'x3'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6279a90> args = ('y', (.identity at 0x7f3fd62094e0>, .identity at 0x7f3fd62094e0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_x_wrapping(self, long_df): x_vars = ["f", "x", "y", "z"] wrap = 3 > p = Plot(long_df, y="y").pair(x=x_vars, wrap=wrap).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1679: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y', 'x0', 'x1', 'x2', 'x3'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPairInterface.test_y_wrapping _______________________ self = p = common = , layers = [] variables = ['x', 'y0', 'y1', 'y2', 'y3'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6469be0> args = ('x', (.identity at 0x7f3fd5fda980>, .identity at 0x7f3fd5fda980>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_y_wrapping(self, long_df): y_vars = ["f", "x", "y", "z"] wrap = 3 > p = Plot(long_df, x="x").pair(y=y_vars, wrap=wrap).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1692: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y0', 'y1', 'y2', 'y3'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestPairInterface.test_non_cross_wrapping ___________________ self = p = common = , layers = [] variables = ['x', 'x0', 'x1', 'x2', 'x3', 'y0', ...] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
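The concatenation step in the source above leans on `DataFrame.filter`, which keeps only the listed labels that actually exist in each frame, so layer frames lacking the `col`/`row` faceting columns contribute just the coordinate column. A minimal standalone sketch (frame contents invented for illustration):

```python
import pandas as pd

# One frame carries a faceting column, the other does not; filter()
# drops everything outside `cols` without raising on missing labels.
common = pd.DataFrame({"x0": [1, 2], "row": ["a", "b"]})
layer = pd.DataFrame({"x0": [3, 4], "y": [5, 6]})  # no "row" or "col"

cols = ["x0", "col", "row"]
parts = [common.filter(cols), layer.filter(cols)]
var_df = pd.concat(parts, ignore_index=True)

print(list(var_df.columns))  # ['x0', 'row']: "y" dropped, "col" absent
```

The rows from `layer` end up with `NaN` in `row`, which is why the concatenated frame stays usable for scale setup even when only some layers are faceted.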
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd7233620> args = ('x', (.identity at 0x7f3fd5fdbcc0>, .identity at 0x7f3fd5fdbcc0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_non_cross_wrapping(self, long_df): x_vars = ["a", "b", "c", "t"] y_vars = ["f", "x", "y", "z"] wrap = 3 p = ( Plot(long_df, x="x") .pair(x=x_vars, y=y_vars, wrap=wrap, cross=False) > .plot() ^^^^^^ ) tests/_core/test_plot.py:1715: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'x0', 'x1', 'x2', 'x3', 'y0', ...] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestPairInterface.test_orient_inference ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6674980> args = ('x', (.identity at 0x7f3fd5be8e00>, .identity at 0x7f3fd5be8e00>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_orient_inference(self, long_df): orient_list = [] class CaptureOrientMove(Move): def __call__(self, data, groupby, orient, scales): orient_list.append(orient) return data ( Plot(long_df, x="x") .pair(y=["b", "z"]) .add(MockMark(), CaptureOrientMove()) > .plot() ^^^^^^ ) tests/_core/test_plot.py:1740: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y0', 'y1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________ TestPairInterface.test_computed_coordinate_orient_inference __________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6674ad0> args = ('y', (.identity at 0x7f3fd618a980>, .identity at 0x7f3fd618a980>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_computed_coordinate_orient_inference(self, long_df): class MockComputeStat(Stat): def __call__(self, df, groupby, orient, scales): other = {"x": "y", "y": "x"}[orient] return df.assign(**{other: df[orient] * 2}) m = MockMark() > Plot(long_df, y="y").add(m, MockComputeStat()).plot() tests/_core/test_plot.py:1753: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________________ TestPairInterface.test_limits _________________________ self = p = common = , layers = [] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6676f90> args = ('y', (.identity at 0x7f3fd5819d20>, .identity at 0x7f3fd5819d20>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_limits(self, long_df): lims = (-3, 10), (-2, 24) > p = Plot(long_df, y="y").pair(x=["x", "z"]).limit(x=lims[0], x1=lims[1]).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1766: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________________ TestPairInterface.test_labels _________________________ self = p = common = , layers = [] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71b1d30> args = ('y', (.identity at 0x7f3fd5322820>, .identity at 0x7f3fd5322820>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_labels(self, long_df): label = "zed" p = ( Plot(long_df, y="y") .pair(x=["x", "z"]) .label(x=str.capitalize, x1=label) ) > ax0, ax1 = p.plot()._figure.axes ^^^^^^^^ tests/_core/test_plot.py:1778: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y', 'x0', 'x1'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestLabelVisibility.test_single_subplot ____________________ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6f4d6a0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] def test_single_subplot(self, long_df): x, y = "a", "z" > p = Plot(long_df, x=x, y=y).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________ TestLabelVisibility.test_1d_column[facet_kws0-pair_kws0] ___________ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6b89010> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] facet_kws = {'col': 'b'}, pair_kws = {} @pytest.mark.parametrize( "facet_kws,pair_kws", [({"col": "b"}, {}), ({}, {"x": ["x", "y", "f"]})] ) def test_1d_column(self, long_df, facet_kws, pair_kws): x = None if "x" in pair_kws else "a" y = "z" > p = Plot(long_df, x=x, y=y).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1803: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________ TestLabelVisibility.test_1d_column[facet_kws1-pair_kws1] ___________

self = 
p = 
common = , layers = []
variables = ['y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
            share_state = None
            subplots = []
        else:
            share_state = self._subplots.subplot_spec[f"share{axis}"]
            subplots = [view for view in self._subplots if view[axis] == coord]

        if scale is None:
            self._scales[var] = Scale._identity()
        else:
            try:
>               self._scales[var] = scale._setup(var_df[var], prop)
                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6ae0980>
args = ('y', (.identity at 0x7f3fd63e6350>, .identity at 0x7f3fd63e6350>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 
long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ...
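This second `TypeError` is the same wrapper problem hitting seaborn's function scale: the call `InternalScale(name, (forward, inverse))` is rewritten by the compat wrapper into `init_func(self, None, name, funcs)`, four positional arguments for an `__init__` that accepts three. A sketch with hypothetical names (`axis_compat`, `FuncScaleLike`), not the real code:

```python
from functools import wraps


def axis_compat(init_func):
    # Simplified form of the wrapper in the traceback above (sketch)
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        axis = kwargs.pop("axis", None)
        return init_func(self, axis, *args, **kwargs)
    return wrapper


class FuncScaleLike:
    @axis_compat
    def __init__(self, axis, functions):  # accepts (self, axis, functions)
        self.functions = functions


def identity(x):
    return x


try:
    # Like InternalScale(name, (forward, inverse)): name passed positionally,
    # so the wrapper forwards (self, None, "y", funcs) -- one argument too many
    FuncScaleLike("y", (identity, identity))
    error = None
except TypeError as err:
    error = type(err).__name__

print(error)  # -> TypeError
```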
8 0.2 c 8 8 [100 rows x 13 columns] facet_kws = {}, pair_kws = {'x': ['x', 'y', 'f']} @pytest.mark.parametrize( "facet_kws,pair_kws", [({"col": "b"}, {}), ({}, {"x": ["x", "y", "f"]})] ) def test_1d_column(self, long_df, facet_kws, pair_kws): x = None if "x" in pair_kws else "a" y = "z" > p = Plot(long_df, x=x, y=y).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1803: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `y` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________ TestLabelVisibility.test_1d_row[facet_kws0-pair_kws0] _____________ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd66282f0> args = ('x', (.identity at 0x7f3fd6011d20>, .identity at 0x7f3fd6011d20>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] facet_kws = {'row': 'b'}, pair_kws = {} @pytest.mark.parametrize( "facet_kws,pair_kws", [({"row": "b"}, {}), ({}, {"y": ["x", "y", "f"]})] ) def test_1d_row(self, long_df, facet_kws, pair_kws): x = "z" y = None if "y" in pair_kws else "z" > p = Plot(long_df, x=x, y=y).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1826: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________ TestLabelVisibility.test_1d_row[facet_kws1-pair_kws1] _____________ self = p = common = , layers = [] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd662bb60> args = ('x', (.identity at 0x7f3fd589ce00>, .identity at 0x7f3fd589ce00>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = long_df = x y z a b ... s f a_cat s_cat s_str 0 12 0.449243 6.611886 b p ... 2 0.2 b ... 8 0.3 a 8 8 99 15 0.073484 1.036343 c p ... 
8 0.2 c 8 8 [100 rows x 13 columns] facet_kws = {}, pair_kws = {'y': ['x', 'y', 'f']} @pytest.mark.parametrize( "facet_kws,pair_kws", [({"row": "b"}, {}), ({}, {"y": ["x", "y", "f"]})] ) def test_1d_row(self, long_df, facet_kws, pair_kws): x = "z" y = None if "y" in pair_kws else "z" > p = Plot(long_df, x=x, y=y).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1826: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = , layers = [] variables = ['x'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestLegend.test_single_layer_single_variable _________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd60af4d0> args = ('x', (.identity at 0x7f3fd6aa5dd0>, .identity at 0x7f3fd6aa5dd0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_single_layer_single_variable(self, xy): s = pd.Series(["a", "b", "a", "c"], name="s") > p = Plot(**xy).add(MockMark(), color=s).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:1981: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestLegend.test_single_layer_common_variable _________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd618f0e0> args = ('x', (.identity at 0x7f3fd72c90c0>, .identity at 0x7f3fd72c90c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_single_layer_common_variable(self, xy): s = pd.Series(["a", "b", "a", "c"], name="s") sem = dict(color=s, marker=s) > p = Plot(**xy).add(MockMark(), **sem).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2000: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, 
common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________ TestLegend.test_single_layer_common_unnamed_variable _____________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd70d01a0> args = ('x', (.identity at 0x7f3fd656bab0>, .identity at 0x7f3fd656bab0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_single_layer_common_unnamed_variable(self, xy): s = np.array(["a", "b", "a", "c"]) sem = dict(color=s, marker=s) > p = Plot(**xy).add(MockMark(), **sem).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2019: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, 
layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestLegend.test_single_layer_multi_variable __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5becd70> args = ('x', (.identity at 0x7f3fd5f7fd70>, .identity at 0x7f3fd5f7fd70>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_single_layer_multi_variable(self, xy): s1 = pd.Series(["a", "b", "a", "c"], name="s1") s2 = pd.Series(["m", "m", "p", "m"], name="s2") sem = dict(color=s1, marker=s2) > p = Plot(**xy).add(MockMark(), **sem).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2040: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ 
seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. 
# But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestLegend.test_multi_layer_single_variable __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd746dd50>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd646b4d0> args = ('x', (.identity at 0x7f3fd618ada0>, .identity at 0x7f3fd618ada0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_multi_layer_single_variable(self, xy): s = pd.Series(["a", "b", "a", "c"], name="s") > p = Plot(**xy, color=s).add(MockMark()).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2061: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, 
common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd746dd50>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestLegend.test_multi_layer_multi_variable __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd5dcd4d0>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5f05010> args = ('x', (.identity at 0x7f3fd7171fe0>, .identity at 0x7f3fd7171fe0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_multi_layer_multi_variable(self, xy): s1 = pd.Series(["a", "b", "a", "c"], name="s1") s2 = pd.Series(["m", "m", "p", "m"], name="s2") sem = dict(color=s1), dict(marker=s2) variables = {"s1": "color", "s2": "marker"} > p = Plot(**xy).add(MockMark(), **sem[0]).add(MockMark(), **sem[1]).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2085: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(arti...rn._core.data.PlotData object at 0x7f3fd5dcd4d0>, 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestLegend.test_multi_layer_different_artists _________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.te...': None, 'legend': True, 'mark': TestLegend.test_multi_layer_different_artists..MockMark2(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd73102f0> args = ('x', (.identity at 0x7f3fd62e2fb0>, .identity at 0x7f3fd62e2fb0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_multi_layer_different_artists(self, xy): class MockMark1(MockMark): def _legend_artist(self, variables, value, scales): return mpl.lines.Line2D([], []) class MockMark2(MockMark): def _legend_artist(self, variables, value, scales): return mpl.patches.Patch() s = pd.Series(["a", "b", "a", "c"], name="s") > p = Plot(**xy, color=s).add(MockMark1()).add(MockMark2()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2112: _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.te...': None, 'legend': True, 'mark': TestLegend.test_multi_layer_different_artists..MockMark2(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestLegend.test_three_layers _________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.te...bf850>, 'label': None, 'legend': True, 'mark': TestLegend.test_three_layers..MockMarkLine(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5e35550> args = ('x', (.identity at 0x7f3fd65bf110>, .identity at 0x7f3fd65bf110>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_three_layers(self, xy): class MockMarkLine(MockMark): def _legend_artist(self, variables, value, scales): return mpl.lines.Line2D([], []) s = pd.Series(["a", "b", "a", "c"], name="s") p = Plot(**xy, color=s) for _ in range(3): p = p.add(MockMarkLine()) > p = p.plot() ^^^^^^^^ tests/_core/test_plot.py:2135: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) 
^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.te...bf850>, 'label': None, 'legend': True, 'mark': TestLegend.test_three_layers..MockMarkLine(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________ TestLegend.test_identity_scale_ignored ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6674980> args = ('x', (.identity at 0x7f3fd654f740>, .identity at 0x7f3fd654f740>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_identity_scale_ignored(self, xy): s = pd.Series(["r", "g", "b", "g"]) > p = Plot(**xy).add(MockMark(), color=s).scale(color=None).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2142: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, 
layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestLegend.test_suppression_in_add_method ___________________ self = p = common = layers = [{'data': , 'label': None, 'legend': False, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702eba0> args = ('x', (.identity at 0x7f3fd5340720>, .identity at 0x7f3fd5340720>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_suppression_in_add_method(self, xy): s = pd.Series(["a", "b", "a", "c"], name="s") > p = Plot(**xy).add(MockMark(), color=s, legend=False).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, 
layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': False, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestLegend.test_anonymous_title ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c4b230> args = ('x', (.identity at 0x7f3fd61a6b90>, .identity at 0x7f3fd61a6b90>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_anonymous_title(self, xy): > p = Plot(**xy, color=["a", "b", "c", "d"]).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2153: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestLegend.test_legendless_mark ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.test_legendless_mark..NoLegendMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5de5160> args = ('x', (.identity at 0x7f3fd6c3a820>, .identity at 0x7f3fd6c3a820>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_legendless_mark(self, xy): class NoLegendMark(MockMark): def _legend_artist(self, variables, value, scales): return None > p = Plot(**xy, color=["a", "b", "c", "d"]).add(NoLegendMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2163: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ 
seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': TestLegend.test_legendless_mark..NoLegendMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________ TestLegend.test_legend_has_no_offset _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd70d3770> args = ('x', (.identity at 0x7f3fd6ed1010>, .identity at 0x7f3fd6ed1010>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_legend_has_no_offset(self, xy): color = np.add(xy["x"], 1e8) > p = Plot(**xy, color=color).add(MockMark()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestLegend.test_layer_legend _________________________ self = p = common = layers = [{'data': , 'label': 'a', 'legend': True, 'mark': MockMark(artis...orn._core.data.PlotData object at 0x7f3fd62dc950>, 'label': 'b', 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71530e0> args = ('x', (.identity at 0x7f3fd72f83b0>, .identity at 0x7f3fd72f83b0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_layer_legend(self, xy): > p = Plot(**xy).add(MockMark(), label="a").add(MockMark(), label="b").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2177: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': 'a', 'legend': True, 'mark': MockMark(artis...orn._core.data.PlotData object at 0x7f3fd62dc950>, 'label': 'b', 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated.
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestLegend.test_layer_legend_with_scale_legend ________________ self = p = common = layers = [{'data': , 'label': 'x', 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ec2a50> args = ('x', (.identity at 0x7f3fd6d06400>, .identity at 0x7f3fd6d06400>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_layer_legend_with_scale_legend(self, xy): s = pd.Series(["a", "b", "a", "c"], name="s") > p = Plot(**xy, color=s).add(MockMark(), label="x").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2186: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, 
layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': 'x', 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit.
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestLegend.test_layer_legend_title ______________________ self = p = common = layers = [{'data': , 'label': 'x', 'legend': True, 'mark': MockMark(artist_kws={}), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd60ac6e0> args = ('x', (.identity at 0x7f3fd72fab90>, .identity at 0x7f3fd72fab90>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = xy = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 4]} def test_layer_legend_title(self, xy): > p = Plot(**xy).add(MockMark(), label="x").label(legend="layer").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_plot.py:2196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
p = 
common = 
layers = [{'data': , 'label': 'x', 'legend': True, 'mark': MockMark(artist_kws={}), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError

___________________ TestContinuous.test_coordinate_defaults ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_coordinate_defaults(self, x):
>       s = Continuous()._setup(x, Coordinate())

tests/_core/test_scales.py:54:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fe06dcc20>
args = ('x', (.identity at 0x7f3fd6253320>, .identity at 0x7f3fd6253320>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestContinuous.test_coordinate_transform ___________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_coordinate_transform(self, x):
>       s = Continuous(trans="log")._setup(x, Coordinate())

tests/_core/test_scales.py:59:
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
args = ('x', (.log at 0x7f3fd7687e20>, .exp at 0x7f3fd7685010>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________ TestContinuous.test_coordinate_transform_with_parameter ____________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_coordinate_transform_with_parameter(self, x):
>       s = Continuous(trans="pow3")._setup(x, Coordinate())

tests/_core/test_scales.py:64:
args = ('x', (.forward at 0x7f3fd7684880>, .inverse at 0x7f3fd7684930>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_interval_defaults _____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_interval_defaults(self, x):
>       s = Continuous()._setup(x, IntervalProperty())

tests/_core/test_scales.py:75:
args = ('x', (.identity at 0x7f3fd7684eb0>, .identity at 0x7f3fd7684eb0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestContinuous.test_interval_with_range ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_interval_with_range(self, x):
>       s = Continuous((1, 3))._setup(x, IntervalProperty())

tests/_core/test_scales.py:80:
args = ('x', (.identity at 0x7f3fd7687cc0>, .identity at 0x7f3fd7687cc0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_interval_with_norm ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_interval_with_norm(self, x):
>       s = Continuous(norm=(3, 7))._setup(x, IntervalProperty())

tests/_core/test_scales.py:85:
args = ('x', (.identity at 0x7f3fd620a4b0>, .identity at 0x7f3fd620a4b0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

__________ TestContinuous.test_interval_with_range_norm_and_transform __________

x = 0 1 1 10 2 100 dtype: int64

    def test_interval_with_range_norm_and_transform(self, x):
        x = pd.Series([1, 10, 100])
        # TODO param order?
>       s = Continuous((2, 3), (10, 100), "log")._setup(x, IntervalProperty())

tests/_core/test_scales.py:92:
args = ('None', (.log at 0x7f3fd620a820>, .exp at 0x7f3fd620b530>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestContinuous.test_interval_with_bools ____________________

    def test_interval_with_bools(self):
        x = pd.Series([True, False, False])
>       s = Continuous()._setup(x, IntervalProperty())

tests/_core/test_scales.py:98:
args = ('None', (.identity at 0x7f3fd62085c0>, .identity at 0x7f3fd62085c0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

______________________ TestContinuous.test_color_defaults ______________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_color_defaults(self, x):
        cmap = color_palette("ch:", as_cmap=True)
>       s = Continuous()._setup(x, Color())

tests/_core/test_scales.py:104:
args = ('x', (.identity at 0x7f3fd620aae0>, .identity at 0x7f3fd620aae0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_color_named_values ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_color_named_values(self, x):
        cmap = color_palette("viridis", as_cmap=True)
>       s = Continuous("viridis")._setup(x, Color())

tests/_core/test_scales.py:110:
args = ('x', (.identity at 0x7f3fd620bd70>, .identity at 0x7f3fd620bd70>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_color_tuple_values ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_color_tuple_values(self, x):
        cmap = color_palette("blend:b,g", as_cmap=True)
>       s = Continuous(("b", "g"))._setup(x, Color())

tests/_core/test_scales.py:116:
args = ('x', (.identity at 0x7f3fd62094e0>, .identity at 0x7f3fd62094e0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

__________________ TestContinuous.test_color_callable_values ___________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_color_callable_values(self, x):
        cmap = color_palette("light:r", as_cmap=True)
>       s = Continuous(cmap)._setup(x, Color())

tests/_core/test_scales.py:122:
args = ('x', (.identity at 0x7f3fd620b740>, .identity at 0x7f3fd620b740>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestContinuous.test_color_with_norm ______________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_color_with_norm(self, x):
        cmap = color_palette("ch:", as_cmap=True)
>       s = Continuous(norm=(3, 7))._setup(x, Color())

tests/_core/test_scales.py:128:
args = ('x', (.identity at 0x7f3fd64f1380>, .identity at 0x7f3fd64f1380>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestContinuous.test_color_with_transform ___________________

x = 0 1.0 1 10.0 2 100.0 Name: x, dtype: float64

    def test_color_with_transform(self, x):
        x = pd.Series([1, 10, 100], name="x", dtype=float)
        cmap = color_palette("ch:", as_cmap=True)
>       s = Continuous(trans="log")._setup(x, Color())

tests/_core/test_scales.py:135:
args = ('x', (.log at 0x7f3fd620b1c0>, .exp at 0x7f3fd620a8d0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________________ TestContinuous.test_tick_locator _______________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_locator(self, x):
        locs = [.2, .5, .8]
        locator = mpl.ticker.FixedLocator(locs)
>       a = self.setup_ticks(x, locator)

tests/_core/test_scales.py:142:
tests/_core/test_scales.py:39: in setup_ticks
    s = Continuous().tick(*args, **kwargs)._setup(x, Coordinate())
args = ('x', (.identity at 0x7f3fd620bd70>, .identity at 0x7f3fd620bd70>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestContinuous.test_tick_upto _________________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_upto(self, x):
        for n in [2, 5, 10]:
>           a = self.setup_ticks(x, upto=n)

tests/_core/test_scales.py:154:
args = ('x', (.identity at 0x7f3fd660ada0>, .identity at 0x7f3fd660ada0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestContinuous.test_tick_every ________________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_every(self, x):
        for d in [.05, .2, .5]:
>           a = self.setup_ticks(x, every=d)

tests/_core/test_scales.py:160:
args = ('x', (.identity at 0x7f3fd5321b10>, .identity at 0x7f3fd5321b10>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_tick_every_between ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_every_between(self, x):
        lo, hi = .2, .8
        for d in [.05, .2, .5]:
>           a = self.setup_ticks(x, every=d, between=(lo, hi))

tests/_core/test_scales.py:167:
args = ('x', (.identity at 0x7f3fd618ada0>, .identity at 0x7f3fd618ada0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_________________________ TestContinuous.test_tick_at __________________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_at(self, x):
        locs = [.2, .5, .9]
>       a = self.setup_ticks(x, at=locs)

tests/_core/test_scales.py:174:
args = ('x', (.identity at 0x7f3fd5c02140>, .identity at 0x7f3fd5c02140>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestContinuous.test_tick_count ________________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_count(self, x):
        n = 8
>       a = self.setup_ticks(x, count=n)

tests/_core/test_scales.py:180:
args = ('x', (.identity at 0x7f3fd5fd85c0>, .identity at 0x7f3fd5fd85c0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestContinuous.test_tick_count_between ____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_count_between(self, x):
        n = 5
        lo, hi = .2, .7
>       a = self.setup_ticks(x, count=n, between=(lo, hi))

tests/_core/test_scales.py:187:
args = ('x', (.identity at 0x7f3fd738cf60>, .identity at 0x7f3fd738cf60>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestContinuous.test_tick_minor ________________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_tick_minor(self, x):
        n = 3
>       a = self.setup_ticks(x, count=2, minor=n)

tests/_core/test_scales.py:193:
args = ('x', (.identity at 0x7f3fd738d640>, .identity at 0x7f3fd738d640>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestContinuous.test_log_tick_default _____________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_log_tick_default(self, x):
>       s = Continuous(trans="log")._setup(x, Coordinate())

tests/_core/test_scales.py:203:
args = ('x', (.log at 0x7f3fd738da60>, .exp at 0x7f3fd738cd50>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

______________________ TestContinuous.test_log_tick_upto _______________________

x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64

    def test_log_tick_upto(self, x):
        n = 3
>       s = Continuous(trans="log").tick(upto=n)._setup(x, Coordinate())

tests/_core/test_scales.py:212:
args = ('x', (.log at 0x7f3fd738cbf0>, .exp at 0x7f3fd738fed0>))
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

______________________
TestContinuous.test_log_tick_count ______________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_log_tick_count(self, x): with pytest.raises(RuntimeError, match="`count` requires"): Continuous(trans="log").tick(count=4) s = Continuous(trans="log").tick(count=4, between=(1, 1000)) > a = PseudoAxis(s._setup(x, Coordinate())._matplotlib_scale) ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe5940> args = ('x', (.log at 0x7f3fd4f754e0>, .exp at 0x7f3fd4f759b0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _________________ TestContinuous.test_log_tick_format_disabled _________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_log_tick_format_disabled(self, x): > s = Continuous(trans="log").label(base=None)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe7770> args = ('x', (.log at 0x7f3fd4f75220>, .exp at 0x7f3fd4f75640>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ___________________ TestContinuous.test_symlog_tick_default ____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_symlog_tick_default(self, x): > s = Continuous(trans="symlog")._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:242: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe7cb0> args = ('x', (.symlog at 0x7f3fd4f762a0>, .symexp at 0x7f3fd4f76ae0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) 
> return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _____________________ TestContinuous.test_label_formatter ______________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_formatter(self, x): fmt = mpl.ticker.FormatStrFormatter("%.3f") > a, locs = self.setup_labels(x, fmt) ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe6a50> args = ('x', (.identity at 0x7f3fd4f77a00>, .identity at 0x7f3fd4f77a00>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ____________________ TestContinuous.test_label_like_pattern ____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_like_pattern(self, x): > a, locs = self.setup_labels(x, like=".4f") 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:261: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe67b0> args = ('x', (.identity at 0x7f3fd4f76560>, .identity at 0x7f3fd4f76560>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ____________________ TestContinuous.test_label_like_string _____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_like_string(self, x): > a, locs = self.setup_labels(x, like="x = {x:.1f}") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe6e40> args = ('x', (.identity at 0x7f3fd4f75e80>, .identity at 0x7f3fd4f75e80>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ___________________ TestContinuous.test_label_like_function ____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_like_function(self, x): > a, locs = self.setup_labels(x, like="{:^5.1f}".format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe57f0> args = ('x', (.identity at 0x7f3fd4f77e20>, .identity at 0x7f3fd4f77e20>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): 
return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ________________________ TestContinuous.test_label_base ________________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_base(self, x): > a, locs = self.setup_labels(100 * x, base=2) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:282: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe56a0> args = ('x', (.identity at 0x7f3fd5fda090>, .identity at 0x7f3fd5fda090>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ________________________ TestContinuous.test_label_unit ________________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, 
dtype: float64 def test_label_unit(self, x): > a, locs = self.setup_labels(1000 * x, unit="g") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe5400> args = ('x', (.identity at 0x7f3fd5fdbc10>, .identity at 0x7f3fd5fdbc10>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ___________________ TestContinuous.test_label_unit_with_sep ____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_unit_with_sep(self, x): > a, locs = self.setup_labels(1000 * x, unit=("", "g")) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:296: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale 
= new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe4ec0> args = ('x', (.identity at 0x7f3fd5fd9590>, .identity at 0x7f3fd5fd9590>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _____________________ TestContinuous.test_label_empty_unit _____________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_empty_unit(self, x): > a, locs = self.setup_labels(1000 * x, unit="") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_core/test_scales.py:46: in setup_labels s = Continuous().label(*args, **kwargs)._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe4d70> args = ('x', (.identity at 0x7f3fd5fdaa30>, .identity at 0x7f3fd5fdaa30>)) kwargs = {}, axis = None @wraps(init_func) def 
wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ________________ TestContinuous.test_label_base_from_transform _________________ self = x = 0 1.0 1 3.0 2 9.0 Name: x, dtype: float64 def test_label_base_from_transform(self, x): s = Continuous(trans="log") > a = PseudoAxis(s._setup(x, Coordinate())._matplotlib_scale) ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe4c20> args = ('x', (.log at 0x7f3fd5fdbed0>, .exp at 0x7f3fd5fda140>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _____________________ TestNominal.test_coordinate_defaults _____________________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_defaults(self, x): > 
s = Nominal()._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:338: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe52b0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ____________________ TestNominal.test_coordinate_with_order ____________________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_with_order(self, x): > s = Nominal(order=["a", "b", "c"])._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:343: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe4c20> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given 
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ________________ TestNominal.test_coordinate_with_subset_order _________________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_with_subset_order(self, x): > s = Nominal(order=["c", "a"])._setup(x, Coordinate()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:348: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe4440> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _______________________ TestNominal.test_coordinate_axis _______________________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_axis(self, x): ax = mpl.figure.Figure().subplots() > s = Nominal()._setup(x, Coordinate(), ax.xaxis) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe70e0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, 
**kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _________________ TestNominal.test_coordinate_axis_with_order __________________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_axis_with_order(self, x): order = ["a", "b", "c"] ax = mpl.figure.Figure().subplots() > s = Nominal(order=order)._setup(x, Coordinate(), ax.xaxis) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:363: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe6cf0> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ______________ TestNominal.test_coordinate_axis_with_subset_order ______________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: object def test_coordinate_axis_with_subset_order(self, x): order = ["c", "a"] ax = mpl.figure.Figure().subplots() > s = Nominal(order=order)._setup(x, Coordinate(), ax.xaxis) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:372: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd6fe5010> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _____________ TestNominal.test_coordinate_axis_with_category_dtype _____________ self = x = 0 a 1 c 2 b 3 c Name: x, dtype: category Categories (4, object): ['b', 'a', 'd', 'c'] def test_coordinate_axis_with_category_dtype(self, x): order = ["b", "a", "d", "c"] x = x.astype(pd.CategoricalDtype(order)) ax = mpl.figure.Figure().subplots() > s = Nominal()._setup(x, Coordinate(), ax.xaxis) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:382: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd702d550> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: 
TypeError

___________________ TestNominal.test_coordinate_numeric_data ___________________

self =  y = 0 1.0 1 -1.5 2 3.0 3 -1.5 Name: y, dtype: float64

    def test_coordinate_numeric_data(self, y):
        ax = mpl.figure.Figure().subplots()
>       s = Nominal()._setup(y, Coordinate(), ax.yaxis)

tests/_core/test_scales.py:390:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702d400>  args = ('y',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________ TestNominal.test_coordinate_numeric_data_with_order ______________

self =  y = 0 1.0 1 -1.5 2 3.0 3 -1.5 Name: y, dtype: float64

    def test_coordinate_numeric_data_with_order(self, y):
        order = [1, 4, -1.5]
        ax = mpl.figure.Figure().subplots()
>       s = Nominal(order=order)._setup(y, Coordinate(), ax.yaxis)

tests/_core/test_scales.py:399:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702fe00>  args = ('y',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________________ TestNominal.test_color_defaults ________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_defaults(self, x):
>       s = Nominal()._setup(x, Color())

tests/_core/test_scales.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702dfd0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestNominal.test_color_named_palette _____________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_named_palette(self, x):
        pal = "flare"
>       s = Nominal(pal)._setup(x, Color())

tests/_core/test_scales.py:413:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702dd30>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestNominal.test_color_list_palette ______________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_list_palette(self, x):
        cs = color_palette("crest", 3)
>       s = Nominal(cs)._setup(x, Color())

tests/_core/test_scales.py:420:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702eba0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestNominal.test_color_dict_palette ______________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_dict_palette(self, x):
        cs = color_palette("crest", 3)
        pal = dict(zip("bac", cs))
>       s = Nominal(pal)._setup(x, Color())

tests/_core/test_scales.py:427:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702cd70>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestNominal.test_color_numeric_data ______________________

self =  y = 0 1.0 1 -1.5 2 3.0 3 -1.5 Name: y, dtype: float64

    def test_color_numeric_data(self, y):
>       s = Nominal()._setup(y, Color())

tests/_core/test_scales.py:432:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702e120>  args = ('y',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________ TestNominal.test_color_numeric_with_order_subset _______________

self =  y = 0 1.0 1 -1.5 2 3.0 3 -1.5 Name: y, dtype: float64

    def test_color_numeric_with_order_subset(self, y):
>       s = Nominal(order=[-1.5, 1])._setup(y, Color())

tests/_core/test_scales.py:438:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702ee40>  args = ('y',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestNominal.test_color_alpha_in_palette ____________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_alpha_in_palette(self, x):
        cs = [(.2, .2, .3, .5), (.1, .2, .3, 1), (.5, .6, .2, 0)]
>       s = Nominal(cs)._setup(x, Color())

tests/_core/test_scales.py:455:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702f380>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestNominal.test_color_unknown_palette ____________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_color_unknown_palette(self, x):
        pal = "not_a_palette"
        err = f"'{pal}' is not a valid palette name"
        with pytest.raises(ValueError, match=err):
>           Nominal(pal)._setup(x, Color())

tests/_core/test_scales.py:463:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702de80>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________________ TestNominal.test_object_defaults _______________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_object_defaults(self, x):
        class MockProperty(ObjectProperty):
            def _default_values(self, n):
                return list("xyz"[:n])
>       s = Nominal()._setup(x, MockProperty())

tests/_core/test_scales.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd6c49be0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_________________________ TestNominal.test_object_list _________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_object_list(self, x):
        vs = ["x", "y", "z"]
>       s = Nominal(vs)._setup(x, ObjectProperty())

tests/_core/test_scales.py:477:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd6c4a3c0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_________________________ TestNominal.test_object_dict _________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_object_dict(self, x):
        vs = {"a": "x", "b": "y", "c": "z"}
>       s = Nominal(vs)._setup(x, ObjectProperty())

tests/_core/test_scales.py:483:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd6c49d30>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestNominal.test_object_order _________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_object_order(self, x):
        vs = ["x", "y", "z"]
>       s = Nominal(vs, order=["c", "a", "b"])._setup(x, ObjectProperty())

tests/_core/test_scales.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702de80>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestNominal.test_object_order_subset _____________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_object_order_subset(self, x):
        vs = ["x", "y"]
>       s = Nominal(vs, order=["a", "c"])._setup(x, ObjectProperty())

tests/_core/test_scales.py:495:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702e510>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestNominal.test_objects_that_are_weird ____________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_objects_that_are_weird(self, x):
        vs = [("x", 1), (None, None, 0), {}]
>       s = Nominal(vs)._setup(x, ObjectProperty())

tests/_core/test_scales.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702e120>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestNominal.test_alpha_default ________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_alpha_default(self, x):
>       s = Nominal()._setup(x, Alpha())

tests/_core/test_scales.py:506:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702eba0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________________ TestNominal.test_fill _____________________________

self =

    def test_fill(self):
        x = pd.Series(["a", "a", "b", "a"], name="x")
>       s = Nominal()._setup(x, Fill())

tests/_core/test_scales.py:512:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fe06dcd70>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

__________________________ TestNominal.test_fill_dict __________________________

self =

    def test_fill_dict(self):
        x = pd.Series(["a", "a", "b", "a"], name="x")
        vs = {"a": False, "b": True}
>       s = Nominal(vs)._setup(x, Fill())

tests/_core/test_scales.py:519:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fe06dcc20>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestNominal.test_fill_nunique_warning _____________________

self =

    def test_fill_nunique_warning(self):
        x = pd.Series(["a", "b", "c", "a", "b"], name="x")
        with pytest.warns(UserWarning, match="The variable assigned to fill"):
>           s = Nominal()._setup(x, Fill())

tests/_core/test_scales.py:526:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702dd30>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

During handling of the above exception, another exception occurred:

self =

    def test_fill_nunique_warning(self):
        x = pd.Series(["a", "b", "c", "a", "b"], name="x")
>       with pytest.warns(UserWarning, match="The variable assigned to fill"):
E       Failed: DID NOT WARN. No warnings of type (,) were emitted.
E       Emitted warnings: [].

tests/_core/test_scales.py:525: Failed

______________________ TestNominal.test_interval_defaults ______________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_interval_defaults(self, x):
        class MockProperty(IntervalProperty):
            _default_range = (1, 2)
>       s = Nominal()._setup(x, MockProperty())

tests/_core/test_scales.py:534:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702f230>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________________ TestNominal.test_interval_tuple ________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_interval_tuple(self, x):
>       s = Nominal((1, 2))._setup(x, IntervalProperty())

tests/_core/test_scales.py:539:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702d940>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestNominal.test_interval_tuple_numeric ____________________

self =  y = 0 1.0 1 -1.5 2 3.0 3 -1.5 Name: y, dtype: float64

    def test_interval_tuple_numeric(self, y):
>       s = Nominal((1, 2))._setup(y, IntervalProperty())

tests/_core/test_scales.py:544:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702ea50>  args = ('y',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestNominal.test_interval_list ________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_interval_list(self, x):
        vs = [2, 5, 4]
>       s = Nominal(vs)._setup(x, IntervalProperty())

tests/_core/test_scales.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702da90>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

________________________ TestNominal.test_interval_dict ________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_interval_dict(self, x):
        vs = {"a": 3, "b": 4, "c": 6}
>       s = Nominal(vs)._setup(x, IntervalProperty())

tests/_core/test_scales.py:556:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702f0e0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

___________________ TestNominal.test_interval_with_transform ___________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_interval_with_transform(self, x):
        class MockProperty(IntervalProperty):
            _forward = np.square
            _inverse = np.sqrt
>       s = Nominal((2, 4))._setup(x, MockProperty())

tests/_core/test_scales.py:565:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702e900>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_________________________ TestNominal.test_empty_data __________________________

self =

    def test_empty_data(self):
        x = pd.Series([], dtype=object, name="x")
>       s = Nominal()._setup(x, Coordinate())

tests/_core/test_scales.py:571:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702fcb0>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

__________________________ TestNominal.test_finalize ___________________________

self =  x = 0 a 1 c 2 b 3 c Name: x, dtype: object

    def test_finalize(self, x):
        ax = mpl.figure.Figure().subplots()
>       s = Nominal()._setup(x, Coordinate(), ax.yaxis)

tests/_core/test_scales.py:577:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .CatScale object at 0x7f3fd702f620>  args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestTemporal.test_coordinate_defaults _____________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_coordinate_defaults(self, t, x):
>       s = Temporal()._setup(t, Coordinate())

tests/_core/test_scales.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702ecf0>
args = ('x', (.identity at 0x7f3fd717af00>, .identity at 0x7f3fd717af00>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestTemporal.test_interval_defaults ______________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_interval_defaults(self, t, x):
>       s = Temporal()._setup(t, IntervalProperty())

tests/_core/test_scales.py:605:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702ef90>
args = ('x', (.identity at 0x7f3fd717b950>, .identity at 0x7f3fd717b950>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

____________________ TestTemporal.test_interval_with_range _____________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_interval_with_range(self, t, x):
        values = (1, 3)
>       s = Temporal((1, 3))._setup(t, IntervalProperty())

tests/_core/test_scales.py:612:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702cc20>
args = ('x', (.identity at 0x7f3fd717aae0>, .identity at 0x7f3fd717aae0>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestTemporal.test_interval_with_norm _____________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_interval_with_norm(self, t, x):
        norm = t[1], t[2]
>       s = Temporal(norm=norm)._setup(t, IntervalProperty())

tests/_core/test_scales.py:620:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702c6e0>
args = ('x', (.identity at 0x7f3fd6b4a8d0>, .identity at 0x7f3fd6b4a8d0>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_______________________ TestTemporal.test_color_defaults _______________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_color_defaults(self, t, x):
        cmap = color_palette("ch:", as_cmap=True)
>       s = Temporal()._setup(t, Color())

tests/_core/test_scales.py:628:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702d550>
args = ('x', (.identity at 0x7f3fd6b4a770>, .identity at 0x7f3fd6b4a770>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

_____________________ TestTemporal.test_color_named_values _____________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_color_named_values(self, t, x):
        name = "viridis"
        cmap = color_palette(name, as_cmap=True)
>       s = Temporal(name)._setup(t, Color())

tests/_core/test_scales.py:636:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd702f4d0>
args = ('x', (.identity at 0x7f3fd6b4ae50>, .identity at 0x7f3fd6b4ae50>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

______________________ TestTemporal.test_coordinate_axis _______________________

self =  t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns]  x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64

    def test_coordinate_axis(self, t, x):
        ax = mpl.figure.Figure().subplots()
>       s = Temporal()._setup(t, Coordinate(), ax.xaxis)

tests/_core/test_scales.py:643:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6fe46e0>
args = ('x', (.identity at 0x7f3fd6ed2fb0>, .identity at 0x7f3fd6ed2fb0>))  kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis'
from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ________________________ TestTemporal.test_tick_locator ________________________ self = t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns] def test_tick_locator(self, t): locator = mpl.dates.YearLocator(month=3, day=15) s = Temporal().tick(locator) > a = PseudoAxis(s._setup(t, Coordinate())._matplotlib_scale) ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:654: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe70e0> args = ('x', (.identity at 0x7f3fd6ed35e0>, .identity at 0x7f3fd6ed35e0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _________________________ TestTemporal.test_tick_upto __________________________ self = t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns] x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64 def 
test_tick_upto(self, t, x): n = 8 ax = mpl.figure.Figure().subplots() > Temporal().tick(upto=n)._setup(t, Coordinate(), ax.xaxis) tests/_core/test_scales.py:662: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702f620> args = ('x', (.identity at 0x7f3fd6b49640>, .identity at 0x7f3fd6b49640>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError ______________________ TestTemporal.test_label_formatter _______________________ self = t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns] def test_label_formatter(self, t): formatter = mpl.dates.DateFormatter("%Y") s = Temporal().label(formatter) > a = PseudoAxis(s._setup(t, Coordinate())._matplotlib_scale) ^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_core/test_scales.py:670: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fe06dcd70> args = ('x', (.identity at 0x7f3fd6b4b1c0>, .identity at 0x7f3fd6b4b1c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError _______________________ TestTemporal.test_label_concise ________________________ self = t = 0 1972-09-27 1 1975-06-24 2 1980-12-14 Name: x, dtype: datetime64[ns] x = 0 1000.0 1 2000.0 2 4000.0 Name: x, dtype: float64 def test_label_concise(self, t, x): ax = mpl.figure.Figure().subplots() > Temporal().label(concise=True)._setup(t, Coordinate(), ax.xaxis) tests/_core/test_scales.py:678: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702d010> args = ('x', (.identity at 0x7f3fd618a770>, .identity at 0x7f3fd618a770>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: 
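[editor's note] Every failure above is the same argument-shift bug: seaborn constructs its internal scale as `InternalScale(name, (forward, inverse))`, the patched wrapper in matplotlib's scale.py sees that `args[0]` is not an `Axis` and injects `axis=None` as an extra leading positional, so `FuncScale.__init__(self, axis, functions)` ends up receiving four positionals. A minimal, self-contained sketch of the mechanism (hypothetical names; not the actual Fedora patch):

```python
from functools import wraps

class FuncScale:
    # Mirrors matplotlib's FuncScale signature: (self, axis, functions)
    def __init__(self, axis, functions):
        self.axis = axis
        self.forward, self.inverse = functions

def make_axis_optional(init_func):
    # Sketch of the wrapper seen in the traceback: when the first positional
    # argument is not an Axis, it pops 'axis' from kwargs and re-inserts it
    # positionally in front of *args.
    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        axis = kwargs.pop('axis', None)
        # Bug: *args still holds (name, functions), so init_func now receives
        # (self, None, name, functions) -- four positionals for a 3-arg method.
        return init_func(self, axis, *args, **kwargs)
    return wrapper

FuncScale.__init__ = make_axis_optional(FuncScale.__init__)

try:
    # Seaborn's call pattern: InternalScale(name, (forward, inverse))
    FuncScale("x", (float, float))
except TypeError as e:
    print(e)  # takes 3 positional arguments but 4 were given
```

The `isinstance(args[0], mpl.axis.Axis)` guard in the real wrapper never fires here because seaborn passes the scale *name* first, which is what routes every call into the broken branch.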
[identical failures condensed: each of the following TestBoolean tests reaches
seaborn/_core/scales.py:194 (_setup) -> :96 (_get_scale) and raises the same
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given,
at matplotlib/scale.py:178; only the failing call differs]

_________________________ TestBoolean.test_coordinate __________________________
tests/_core/test_scales.py:691:  s = Boolean()._setup(x, Coordinate())
_______________________ TestBoolean.test_coordinate_axis _______________________
tests/_core/test_scales.py:697:  s = Boolean()._setup(x, Coordinate(), ax.xaxis)
_______________ TestBoolean.test_coordinate_missing[object-nan] ________________
tests/_core/test_scales.py:714:  s = Boolean()._setup(x, Coordinate())
_______________ TestBoolean.test_coordinate_missing[object-None] _______________
tests/_core/test_scales.py:714:  s = Boolean()._setup(x, Coordinate())
_____________ TestBoolean.test_coordinate_missing[boolean-value2] ______________
tests/_core/test_scales.py:714:  s = Boolean()._setup(x, Coordinate())
_______________________ TestBoolean.test_color_defaults ________________________
tests/_core/test_scales.py:719:  s = Boolean()._setup(x, Color())
_____________________ TestBoolean.test_color_list_palette ______________________
tests/_core/test_scales.py:727:  s = Boolean(cs)._setup(x, Color())
_____________________ TestBoolean.test_color_tuple_palette _____________________
tests/_core/test_scales.py:734:  s = Boolean(cs)._setup(x, Color())
_____________________ TestBoolean.test_color_dict_palette ______________________
tests/_core/test_scales.py:742:  s = Boolean(pal)._setup(x, Color())
_______________________ TestBoolean.test_object_defaults _______________________
tests/_core/test_scales.py:754:  s = Boolean()._setup(x, MockProperty())
_________________________ TestBoolean.test_object_list _________________________
tests/_core/test_scales.py:761:  s = Boolean(vs)._setup(x, ObjectProperty())
_________________________ TestBoolean.test_object_dict _________________________
tests/_core/test_scales.py:768:  s = Boolean(vs)._setup(x, ObjectProperty())
____________________________ TestBoolean.test_fill _____________________________
tests/_core/test_scales.py:774:  s = Boolean()._setup(x, Fill())
______________________ TestBoolean.test_interval_defaults ______________________
tests/_core/test_scales.py:784:  s = Boolean()._setup(x, MockProperty())
_______________________ TestBoolean.test_interval_tuple ________________________
tests/_core/test_scales.py:791:  s = Boolean(vs)._setup(x, IntervalProperty())
__________________________ TestBoolean.test_finalize ___________________________
tests/_core/test_scales.py:798:  s = Boolean()._setup(x, Coordinate(), ax.xaxis)
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError
________________________ TestArea.test_single_defaults _________________________

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True,
           'mark': Area(artist_k...=, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, baseline=<0>),
           ...}]
variables = ['x', 'y']

    def _setup_scales(
        self,
        p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[wrapper frame condensed: same matplotlib/scale.py:178 wrapper as above]

E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

    def test_single_defaults(self):
        x, y = [1, 2, 3], [1, 2, 1]
>       p = Plot(x=x, y=y).add(Area()).plot()

tests/_marks/test_area.py:16:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Area(artist_k...=, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestArea.test_set_properties _________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Area(artist_k...'.33', alpha=0.3, fill=, edgecolor='.88', edgealpha=0.8, edgewidth=2, edgestyle=(0, (2, 1)), baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            variables = []
            ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd4f6a660>
args = ('x', (<...identity at 0x7f3fd4f71a60>, <...identity at 0x7f3fd4f71a60>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_set_properties(self):
        x, y = [1, 2, 3], [1, 2, 1]
        mark = Area(
            color=".33", alpha=.3,
            edgecolor=".88", edgealpha=.8,
            edgewidth=2, edgestyle=(0, (2, 1)),
        )
>       p = Plot(x=x, y=y).add(mark).plot()

tests/_marks/test_area.py:48:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Area(artist_k...'.33', alpha=0.3, fill=<...>, edgecolor='.88', edgealpha=0.8, edgewidth=2, edgestyle=(0, (2, 1)), baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestArea.test_mapped_properties ________________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Area(artist_k...=<...>, edgecolor=<...>, edgealpha=<1>, edgewidth=<...>, edgestyle=<'-'>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd57c9fd0>
args = ('x', (<...identity at 0x7f3fd4fe07d0>, <...identity at 0x7f3fd4fe07d0>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_mapped_properties(self):
        x, y = [1, 2, 3, 2, 3, 4], [1, 2, 1, 1, 3, 2]
        g = ["a", "a", "a", "b", "b", "b"]
        cs = [".2", ".8"]
>       p = Plot(x=x, y=y, color=g, edgewidth=g).scale(color=cs).add(Area()).plot()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_area.py:71:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Area(artist_k...=<...>, edgecolor=<...>, edgealpha=<1>, edgewidth=<...>, edgestyle=<'-'>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________________ TestArea.test_unfilled ____________________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Area(artist_k...l=False, edgecolor=<...>, edgealpha=<1>, edgewidth=<...>, edgestyle=<'-'>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd5865940>
args = ('x', (<...identity at 0x7f3fd585f480>, <...identity at 0x7f3fd585f480>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_unfilled(self):
        x, y = [1, 2, 3], [1, 2, 1]
        c = ".5"
>       p = Plot(x=x, y=y).add(Area(fill=False, color=c)).plot()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_area.py:95:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Area(artist_k...l=False, edgecolor=<...>, edgealpha=<1>, edgewidth=<...>, edgestyle=<'-'>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        ...
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________________ TestBand.test_range ______________________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Band(artist_k...color=<'C0'>, alpha=<0.2>, fill=<...>, edgecolor=<...>, edgealpha=<1>, edgewidth=<0>, edgestyle=<'-'>), ...}]
variables = ['x', 'ymin', 'ymax']

    def _setup_scales(
        ...
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd4bf9400>
args = ('x', (<...identity at 0x7f3fd57a21f0>, <...identity at 0x7f3fd57a21f0>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        ...
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_range(self):
        x, ymin, ymax = [1, 2, 4], [2, 1, 4], [3, 3, 5]
>       p = Plot(x=x, ymin=ymin, ymax=ymax).add(Band()).plot()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_area.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Band(artist_k...color=<'C0'>, alpha=<0.2>, fill=<...>, edgecolor=<...>, edgealpha=<1>, edgewidth=<0>, edgestyle=<'-'>), ...}]
variables = ['x', 'ymin', 'ymax']

    def _setup_scales(
        ...
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________________ TestBand.test_auto_range ___________________________

self = <...>
p = <...>
common = <...>
layers = [{'data': <...>, 'label': None, 'legend': True, 'mark': Band(artist_k...color=<'C0'>, alpha=<0.2>, fill=<...>, edgecolor=<...>, edgealpha=<1>, edgewidth=<0>, edgestyle=<'-'>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4b9cd70> args = ('x', (.identity at 0x7f3fd4be4f60>, .identity at 0x7f3fd4be4f60>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_auto_range(self): x = [1, 1, 2, 2, 2] y = [1, 2, 3, 4, 5] > p = Plot(x=x, y=y).add(Band()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_area.py:120: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = 
common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Band(artist_k...color=<'C0'>, alpha=<0.2>, fill=, edgecolor=, edgealpha=<1>, edgewidth=<0>, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________ TestBar.test_categorical_positions_vertical __________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:288: in _setup mpl_scale = CatScale(data.name) ^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .CatScale object at 0x7f3fd4c20830> args = ('x',), kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_categorical_positions_vertical(self): x = ["a", "b"] y = [1, 2] w = .8 > bars = self.plot_bars({"x": x, "y": y}, {}, {}) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:33: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_marks/test_bar.py:17: in plot_bars p = Plot(**variables).add(Bar(**mark_kws), **layer_kws).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 
'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. 
scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________ TestBar.test_categorical_positions_horizontal _________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4c21550> args = ('x', (.identity at 0x7f3fd4c028d0>, .identity at 0x7f3fd4c028d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_categorical_positions_horizontal(self): x = [1, 2] y = ["a", "b"] w = .8 > bars = self.plot_bars({"x": x, "y": y}, {}, {}) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:42: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_marks/test_bar.py:17: in plot_bars p = Plot(**variables).add(Bar(**mark_kws), **layer_kws).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) 
^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestBar.test_numeric_positions_vertical ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd7310050> args = ('x', (.identity at 0x7f3fd585fcc0>, .identity at 0x7f3fd585fcc0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_numeric_positions_vertical(self): x = [1, 2] y = [3, 4] w = .8 > bars = self.plot_bars({"x": x, "y": y}, {}, {}) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:51: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_marks/test_bar.py:17: in plot_bars p = Plot(**variables).add(Bar(**mark_kws), **layer_kws).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ 
seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError __________________ TestBar.test_numeric_positions_horizontal ___________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702ee40> args = ('x', (.identity at 0x7f3fd6009430>, .identity at 0x7f3fd6009430>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_numeric_positions_horizontal(self): x = [1, 2] y = [3, 4] w = .8 > bars = self.plot_bars({"x": x, "y": y}, {}, {"orient": "h"}) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:60: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/_marks/test_bar.py:17: in plot_bars p = Plot(**variables).add(Bar(**mark_kws), **layer_kws).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:932: in plot return 
self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestBar.test_set_properties __________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bar(artist_kw...pha=0.5, fill=, edgecolor='.3', edgealpha=0.9, edgewidth=1.5, edgestyle=(2, 1), width=<0.8>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd4c23380>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <tests._marks.test_bar.TestBar object>

    def test_set_properties(self):
        x = ["a", "b", "c"]
        y = [1, 3, 2]
        mark = Bar(
            color=".8",
            alpha=.5,
            edgecolor=".3",
            edgealpha=.9,
            edgestyle=(2, 1),
            edgewidth=1.5,
        )
>       p = Plot(x, y).add(mark).plot()

tests/_marks/test_bar.py:78:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestBar.test_mapped_properties ________________________

self = <seaborn._core.plot.Plotter object>
p = <seaborn._core.plot.Plot object>
common = <seaborn._core.data.PlotData object>
layers = [{'data': <seaborn._core.data.PlotData object>, 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(...) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd4c23cb0>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <tests._marks.test_bar.TestBar object>

    def test_mapped_properties(self):
        x = ["a", "b"]
        y = [1, 2]
        mark = Bar(alpha=.2)
>       p = Plot(x, y, color=x, edgewidth=y).add(mark).plot()

tests/_marks/test_bar.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestBar.test_zero_height_skipped _______________________

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
        ...
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd6543620>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <tests._marks.test_bar.TestBar object>

    def test_zero_height_skipped(self):
>       p = Plot(["a", "b", "c"], [1, 0, 2]).add(Bar()).plot()

tests/_marks/test_bar.py:103:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________________ TestBar.test_artist_kws_clip _________________________

self = <seaborn._core.plot.Plotter object>
p = <seaborn._core.plot.Plot object>
common = <seaborn._core.data.PlotData object>
layers = [{'data': <seaborn._core.data.PlotData object>, 'label': None, 'legend': True, 'mark': Bar(artist_kw...color=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<0.8>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(...) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:288: in _setup
    mpl_scale = CatScale(data.name)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...CatScale object at 0x7f3fd6fe6f90>
args = ('x',), kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: LinearScale.__init__() takes 2 positional arguments but 3 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <tests._marks.test_bar.TestBar object>

    def test_artist_kws_clip(self):
>       p = Plot(["a", "b"], [1, 2]).add(Bar({"clip_on": False})).plot()

tests/_marks/test_bar.py:109:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________________ TestBars.test_positions ____________________________

self = <seaborn._core.plot.Plotter object>
p = <seaborn._core.plot.Plot object>
common = <seaborn._core.data.PlotData object>
layers = [{'data': <seaborn._core.data.PlotData object>, 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}]
variables = ['x', 'y']

    def _setup_scales(...) -> None:
        ...
            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...InternalScale object at 0x7f3fd73e2a50>
args = ('x', (<function ...identity at 0x7f3fd5321590>, <function ...identity at 0x7f3fd5321590>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <tests._marks.test_bar.TestBars object>
x = 0    4
1    5
2    6
3    7
4    8
Name: x, dtype: int64
y = 0    2
1    8
2    3
3    5
4    9
Name: y, dtype: int64

    def test_positions(self, x, y):
>       p = Plot(x, y).add(Bars()).plot()

tests/_marks/test_bar.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <seaborn._core.plot.Plotter object>
variables = ['x', 'y']

    def _setup_scales(...) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestBars.test_positions_horizontal ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c7a3c0> args = ('x', (.identity at 0x7f3fd5047c10>, .identity at 0x7f3fd5047c10>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = x = 0 4 1 5 2 6 3 7 4 8 Name: x, dtype: int64 y = 0 2 1 8 2 3 3 5 4 9 Name: y, dtype: int64 def test_positions_horizontal(self, x, y): > p = Plot(x=y, y=x).add(Bars(), orient="h").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:143: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________________ TestBars.test_width ______________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=0.4, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f01d30> args = ('x', (.identity at 0x7f3fd4f77320>, .identity at 0x7f3fd4f77320>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = x = 0 4 1 5 2 6 3 7 4 8 Name: x, dtype: int64 y = 0 2 1 8 2 3 3 5 4 9 Name: y, dtype: int64 def test_width(self, x, y): > p = Plot(x, y).add(Bars(width=.4)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=0.4, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________ TestBars.test_mapped_color_direct_alpha ____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd721d6a0> args = ('x', (.identity at 0x7f3fd5fda090>, .identity at 0x7f3fd5fda090>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = x = 0 4 1 5 2 6 3 7 4 8 Name: x, dtype: int64 y = 0 2 1 8 2 3 3 5 4 9 Name: y, dtype: int64 color = 0 a 1 b 2 c 3 a 4 c Name: color, dtype: object def test_mapped_color_direct_alpha(self, x, y, color): alpha = .5 > p = Plot(x, y, color=color).add(Bars(alpha=alpha)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:167: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________________ TestBars.test_mapped_edgewidth ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6cad160> args = ('x', (.identity at 0x7f3fd7179640>, .identity at 0x7f3fd7179640>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = x = 0 4 1 5 2 6 3 7 4 8 Name: x, dtype: int64 y = 0 2 1 8 2 3 3 5 4 9 Name: y, dtype: int64 def test_mapped_edgewidth(self, x, y): > p = Plot(x, y, edgewidth=y).add(Bars()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:176: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestBars.test_auto_edgewidth _________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6d20ad0> args = ('x', (.identity at 0x7f3fd57c6140>, .identity at 0x7f3fd57c6140>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_auto_edgewidth(self): x0 = np.arange(10) x1 = np.arange(1000) > p0 = Plot(x0, x0).add(Bars()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:186: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________________ TestBars.test_unfilled ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...ha=<0.7>, fill=False, edgecolor='C4', edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ae0590> args = ('x', (.identity at 0x7f3fd4f67320>, .identity at 0x7f3fd4f67320>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = x = 0 4 1 5 2 6 3 7 4 8 Name: x, dtype: int64 y = 0 2 1 8 2 3 3 5 4 9 Name: y, dtype: int64 def test_unfilled(self, x, y): > p = Plot(x, y).add(Bars(fill=False, edgecolor="C4")).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...ha=<0.7>, fill=False, edgecolor='C4', edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________________ TestBars.test_log_scale ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ae3e00> args = ('x', (.log at 0x7f3fd58bfed0>, .exp at 0x7f3fd62a3a00>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = 
kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_log_scale(self): x = y = [1, 10, 100, 1000] > p = Plot(x, y).add(Bars()).scale(x="log").plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_bar.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Bars(artist_k...rue>, edgecolor=, edgealpha=<1>, edgewidth=, edgestyle=<'-'>, width=<1>, baseline=<0>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________________ TestDot.test_simple ______________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dot(artist_kw...>, alpha=<1>, fill=, edgecolor=, edgealpha=, edgewidth=<0.5>, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6543b60> args = ('x', (.identity at 0x7f3fd4c01c70>, .identity at 0x7f3fd4c01c70>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_simple(self): x = [1, 2, 3] y = [4, 5, 2] > p = Plot(x=x, y=y).add(Dot()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_dot.py:39: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dot(artist_kw...>, alpha=<1>, fill=, edgecolor=, edgealpha=, edgewidth=<0.5>, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestDot.test_filled_unfilled_mix _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dot(artist_kw...=2, color=<'C0'>, alpha=<1>, fill=, edgecolor='w', edgealpha=, edgewidth=1, edgestyle=<'-'>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant.
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe67b0> args = ('x', (.identity at 0x7f3fd5923270>, .identity at 0x7f3fd5923270>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = <...>

    def test_filled_unfilled_mix(self):
        x = [1, 2]
        y = [4, 5]
        marker = ["a", "b"]
        shapes = ["o", "x"]
        mark = Dot(edgecolor="w", stroke=2, edgewidth=1)
>       p = Plot(x=x, y=y).add(mark, marker=marker).scale(marker=shapes).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_dot.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[The `_setup_scales` frame and its source listing repeat here, identical to the
frame shown above, up through the `if axis is None:` block.]
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________ TestDot.test_missing_coordinate_data _____________________

    def test_missing_coordinate_data(self):
        x = [1, float("nan"), 3]
        y = [5, 3, 4]
>       p = Plot(x=x, y=y).add(Dot()).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_dot.py:71:

[Traceback otherwise identical to test_filled_unfilled_mix above: the chain is
`scale._setup` at seaborn/_core/plot.py:1376 -> `InternalScale(name, (forward,
inverse))` at seaborn/_core/scales.py:96 -> TypeError: FuncScale.__init__()
takes 3 positional arguments but 4 were given at matplotlib/scale.py:178,
re-raised as PlotSpecError: Scale setup failed for the `x` variable at
seaborn/_core/plot.py:1378.]

__________________ TestDot.test_missing_semantic_data[color] ___________________

    @pytest.mark.parametrize("prop", ["color", "fill", "marker", "pointsize"])
    def test_missing_semantic_data(self, prop):
        x = [1, 2, 3]
        y = [5, 3, 4]
        z = ["a", float("nan"), "b"]
>       p = Plot(x=x, y=y, **{prop: z}).add(Dot()).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_dot.py:83:

[Traceback identical to the above, for prop = 'color'.]

___________________ TestDot.test_missing_semantic_data[fill] ___________________

[Same test source and traceback, for prop = 'fill'.]

__________________ TestDot.test_missing_semantic_data[marker] __________________

[Same test source and traceback, for prop = 'marker'.]

________________ TestDot.test_missing_semantic_data[pointsize] _________________

[Same test source and traceback, for prop = 'pointsize'.]
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable.
E                   See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________________ TestDots.test_simple _____________________________

self = <Plotter ...>, p = <Plot ...>, common = <PlotData ...>
layers = [{... 'mark': Dots(artist_k..., pointsize=<4>, stroke=<0.75>,
           color=<'C0'>, alpha=<1>, fill=<...>, fillcolor=<...>,
           fillalpha=<0.2>), ...}]
variables = ['x', 'y']

[_setup_scales listing and chained traceback identical to
TestDot.test_missing_semantic_data above:
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given,
at /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178]

The above exception was the direct cause of the following exception:

self = <TestDots ...>

    def test_simple(self):
        x = [1, 2, 3]
        y = [4, 5, 2]
>       p = Plot(x=x, y=y).add(Dots()).plot()

tests/_marks/test_dot.py:95:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[_setup_scales listing repeated, down to:]
            # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable.
E                   See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________________ TestDots.test_set_color ____________________________

self = <Plotter ...>, p = <Plot ...>, common = <PlotData ...>
layers = [{... 'mark': Dots(..., pointsize=<4>, stroke=<0.75>, color='.25',
           alpha=<1>, fill=<...>, fillcolor=<...>, fillalpha=<0.2>), ...}]
variables = ['x', 'y']

[_setup_scales listing and chained traceback identical to
TestDot.test_missing_semantic_data above:
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given,
at /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178]

The above exception was the direct cause of the following exception:

self = <TestDots ...>

    def test_set_color(self):
        x = [1, 2, 3]
        y = [4, 5, 2]
        m = Dots(color=".25")
>       p = Plot(x=x, y=y).add(m).plot()

tests/_marks/test_dot.py:108:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[_setup_scales listing repeated, down to:]
            # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable.
E                   See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
___________________________ TestDots.test_map_color ____________________________

self = <Plotter ...>, p = <Plot ...>, common = <PlotData ...>
layers = [{... 'mark': Dots(..., pointsize=<4>, stroke=<0.75>, color=<'C0'>,
           alpha=<1>, fill=<...>, fillcolor=<...>, fillalpha=<0.2>), ...}]
variables = ['x', 'y']

[_setup_scales listing and chained traceback identical to
TestDot.test_missing_semantic_data above:
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given,
at /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178]

The above exception was the direct cause of the following exception:

self = <TestDots ...>

    def test_map_color(self):
        x = [1, 2, 3]
        y = [4, 5, 2]
        c = ["a", "b", "a"]
>       p = Plot(x=x, y=y, color=c).add(Dots()).plot()

tests/_marks/test_dot.py:120:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[_setup_scales listing repeated, down to:]
            # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable.
E                   See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
______________________________ TestDots.test_fill ______________________________

self = <Plotter ...>, p = <Plot ...>, common = <PlotData ...>
layers = [{... 'mark': Dots(..., pointsize=<4>, stroke=<0.75>, color=<'C0'>,
           alpha=<1>, fill=False, fillcolor=<...>, fillalpha=<0.2>), ...}]
variables = ['x', 'y']

[_setup_scales listing and chained traceback identical to
TestDot.test_missing_semantic_data above:
TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given,
at /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178]

The above exception was the direct cause of the following exception:

self = <TestDots ...>

    def test_fill(self):
        x = [1, 2, 3]
        y = [4, 5, 2]
        c = ["a", "b", "a"]
>       p = Plot(x=x, y=y, color=c).add(Dots(fill=False)).plot()

tests/_marks/test_dot.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)

[_setup_scales listing repeated, down to:]
            # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________________ TestDots.test_pointsize ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...r>, pointsize=3, stroke=<0.75>, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd71b1a90> args = ('x', (.identity at 0x7f3fd61a68d0>, .identity at 0x7f3fd61a68d0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_pointsize(self): x = [1, 2, 3] y = [4, 5, 2] s = 3 > p = Plot(x=x, y=y).add(Dots(pointsize=s)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_dot.py:146: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...r>, pointsize=3, stroke=<0.75>, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________________ TestDots.test_stroke _____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...rker>, pointsize=<4>, stroke=3, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd60cd400> args = ('x', (.identity at 0x7f3fd7420880>, .identity at 0x7f3fd7420880>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_stroke(self): x = [1, 2, 3] y = [4, 5, 2] s = 3 > p = Plot(x=x, y=y).add(Dots(stroke=s)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_dot.py:157: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...rker>, pointsize=<4>, stroke=3, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestDots.test_filled_unfilled_mix _______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...rker>, pointsize=<4>, stroke=2, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6620d70> args = ('x', (.identity at 0x7f3fd5320d50>, .identity at 0x7f3fd5320d50>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_filled_unfilled_mix(self): x = [1, 2] y = [4, 5] marker = ["a", "b"] shapes = ["o", "x"] mark = Dots(stroke=2) > p = Plot(x=x, y=y).add(mark, marker=marker).scale(marker=shapes).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_dot.py:171: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dots(artist_k...rker>, pointsize=<4>, stroke=2, color=<'C0'>, alpha=<1>, fill=, fillcolor=, fillalpha=<0.2>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________________ TestPath.test_xy_data _____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4b58980> args = ('x', (.identity at 0x7f3fd6aa56f0>, .identity at 0x7f3fd6aa56f0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_xy_data(self): x = [1, 5, 3, np.nan, 2] y = [1, 4, 2, 5, 3] g = [1, 2, 1, 1, 2] > p = Plot(x=x, y=y, group=g).add(Path()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:20: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPath.test_shared_colors_direct ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd5bec440> args = ('x', (.identity at 0x7f3fd5323ab0>, .identity at 0x7f3fd5323ab0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_shared_colors_direct(self): x = y = [1, 2, 3] color = ".44" m = Path(color=color) > p = Plot(x=x, y=y).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:33: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________ TestPath.test_separate_colors_direct _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...arker>, pointsize=, fillcolor='.77', edgecolor='.55', edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4b5b4d0> args = ('x', (.identity at 0x7f3fd7420720>, .identity at 0x7f3fd7420720>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_separate_colors_direct(self): x = y = [1, 2, 3] y = [1, 2, 3] m = Path(color=".22", edgecolor=".55", fillcolor=".77") > p = Plot(x=x, y=y).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...arker>, pointsize=, fillcolor='.77', edgecolor='.55', edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestPath.test_shared_colors_mapped ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ae06e0> args = ('x', (.identity at 0x7f3fd7684510>, .identity at 0x7f3fd7684510>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_shared_colors_mapped(self): x = y = [1, 2, 3, 4] c = ["a", "a", "b", "b"] m = Path() > p = Plot(x=x, y=y, color=c).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:55: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________ TestPath.test_separate_colors_mapped _____________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. 
cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702fa10> args = ('x', (.identity at 0x7f3fd7172350>, .identity at 0x7f3fd7172350>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, 
**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_separate_colors_mapped(self): x = y = [1, 2, 3, 4] c = ["a", "a", "b", "b"] d = ["x", "y", "x", "y"] m = Path() > p = Plot(x=x, y=y, color=c, fillcolor=d).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________________ TestPath.test_color_with_alpha ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...nes.markersize>, fillcolor=(0.2, 0.2, 0.3, 0.9), edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6fe46e0> args = ('x', (.identity at 0x7f3fd6009220>, .identity at 0x7f3fd6009220>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_color_with_alpha(self): x = y = [1, 2, 3] m = Path(color=(.4, .9, .2, .5), fillcolor=(.2, .2, .3, .9)) > p = Plot(x=x, y=y).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...nes.markersize>, fillcolor=(0.2, 0.2, 0.3, 0.9), edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later.
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ________________________ TestPath.test_color_and_alpha _________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...rc:lines.markersize>, fillcolor=(0.2, 0.2, 0.3), edgecolor=, edgewidth=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4c22120> args = ('x', (.identity at 0x7f3fd7465b10>, .identity at 0x7f3fd7465b10>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self = 

    def test_color_and_alpha(self):
        x = y = [1, 2, 3]
        m = Path(color=(.4, .9, .2), fillcolor=(.2, .2, .3), alpha=.5)
>       p = Plot(x=x, y=y).add(m).plot()

tests/_marks/test_line.py:91: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
>   raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestPath.test_other_props_direct _______________________

self =  p =  common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k...th=3, linestyle='--', marker='s', pointsize=10, fillcolor=, edgecolor=, edgewidth=1), ...}]
variables = ['x', 'y']

seaborn/_core/plot.py:1376: in _setup_scales
>   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd6f386e0>
args = ('x', (.identity at 0x7f3fd5fda610>, .identity at 0x7f3fd5fda610>))
kwargs = {}, axis = None

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: in wrapper
>   return init_func(self, axis, *args, **kwargs)
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

The above exception was the direct cause of the following exception:

self = 

    def test_other_props_direct(self):
        x = y = [1, 2, 3]
        m = Path(marker="s", linestyle="--", linewidth=3, pointsize=10, edgewidth=1)
>       p = Plot(x=x, y=y).add(m).plot()

tests/_marks/test_line.py:101: 
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
>   raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestPath.test_other_props_mapped _______________________

self =  p =  common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}]
variables = ['x', 'y']

seaborn/_core/plot.py:1376: in _setup_scales
>   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd4b5bb60>
args = ('x', (.identity at 0x7f3fd4c03b60>, .identity at 0x7f3fd4c03b60>))
kwargs = {}, axis = None

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: in wrapper
>   return init_func(self, axis, *args, **kwargs)
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

The above exception was the direct cause of the following exception:

self = 

    def test_other_props_mapped(self):
        x = y = [1, 2, 3, 4]
        g = ["a", "a", "b", "b"]
        m = Path()
>       p = Plot(x=x, y=y, marker=g, linestyle=g, pointsize=g).add(m).plot()

tests/_marks/test_line.py:114: 
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
>   raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________________ TestPath.test_capstyle ____________________________

self =  p =  common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': Path(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}]
variables = ['x', 'y']

seaborn/_core/plot.py:1376: in _setup_scales
>   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd649f4d0>
args = ('x', (.identity at 0x7f3fd6b487d0>, .identity at 0x7f3fd6b487d0>))
kwargs = {}, axis = None

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: in wrapper
>   return init_func(self, axis, *args, **kwargs)
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

The above exception was the direct cause of the following exception:

self = 

    def test_capstyle(self):
        x = y = [1, 2]
        rc = {"lines.solid_capstyle": "projecting", "lines.dash_capstyle": "round"}
>       p = Plot(x, y).add(Path()).theme(rc).plot()

tests/_marks/test_line.py:126: 
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
>   raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________________ TestLine.test_xy_data _____________________________

self =  p =  common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': Line(artist_k..., fillcolor=, edgecolor=, edgewidth=), ...}]
variables = ['x', 'y']

seaborn/_core/plot.py:1376: in _setup_scales
>   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd6096f90>
args = ('x', (.identity at 0x7f3fd57943b0>, .identity at 0x7f3fd57943b0>))
kwargs = {}, axis = None

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: in wrapper
>   return init_func(self, axis, *args, **kwargs)
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

The above exception was the direct cause of the following exception:

self = 

    def test_xy_data(self):
        x = [1, 5, 3, np.nan, 2]
        y = [1, 4, 2, 5, 3]
        g = [1, 2, 1, 1, 2]
>       p = Plot(x=x, y=y, group=g).add(Line()).plot()

tests/_marks/test_line.py:148: 
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
>   raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
____________________________ TestPaths.test_xy_data ____________________________

self =  p =  common = 
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

seaborn/_core/plot.py:1376: in _setup_scales
>   self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))

self = .InternalScale object at 0x7f3fd604ea50>
args = ('x', (.identity at 0x7f3fd62c6560>, .identity at 0x7f3fd62c6560>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_xy_data(self):
        x = [1, 5, 3, np.nan, 2]
        y = [1, 4, 2, 5, 3]
        g = [1, 2, 1, 1, 2]
>       p = Plot(x=x, y=y, group=g).add(Paths()).plot()

tests/_marks/test_line.py:164:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestPaths.test_set_properties _________________________

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_kws={'capstyle': }, color='.737', alpha=<1>, linewidth=1, linestyle=(3, 1)), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6f023c0>
args = ('x', (.identity at 0x7f3fd6172ae0>, .identity at 0x7f3fd6172ae0>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_set_properties(self):
        x = y = [1, 2, 3]
        m = Paths(color=".737", linewidth=1, linestyle=(3, 1))
>       p = Plot(x=x, y=y).add(m).plot()

tests/_marks/test_line.py:179:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_kws={'capstyle': }, color='.737', alpha=<1>, linewidth=1, linestyle=(3, 1)), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestPaths.test_mapped_properties _______________________

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6ccdd30>
args = ('x', (.identity at 0x7f3fd585eb90>, .identity at 0x7f3fd585eb90>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_mapped_properties(self):
        x = y = [1, 2, 3, 4]
        g = ["a", "a", "b", "b"]
>       p = Plot(x=x, y=y, color=g, linewidth=g, linestyle=g).add(Paths()).plot()

tests/_marks/test_line.py:190:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestPaths.test_color_with_alpha ________________________

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...cting'>}, color=(0.2, 0.6, 0.9, 0.5), alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd5e5d6a0>
args = ('x', (.identity at 0x7f3fd53e7b60>, .identity at 0x7f3fd53e7b60>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_color_with_alpha(self):
        x = y = [1, 2, 3]
        m = Paths(color=(.2, .6, .9, .5))
>       p = Plot(x=x, y=y).add(m).plot()

tests/_marks/test_line.py:201:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...cting'>}, color=(0.2, 0.6, 0.9, 0.5), alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestPaths.test_color_and_alpha ________________________

self =  p =  common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...projecting'>}, color=(0.2, 0.6, 0.9), alpha=0.5, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:
        ...
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd4be5010>
args = ('x', (.identity at 0x7f3fd620ab90>, .identity at 0x7f3fd620ab90>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_color_and_alpha(self): x = y = [1, 2, 3] m = Paths(color=(.2, .6, .9), alpha=.5) > p = Plot(x=x, y=y).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:209: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...projecting'>}, color=(0.2, 0.6, 0.9), alpha=0.5, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ___________________________ TestPaths.test_capstyle ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6248980> args = ('x', (.identity at 0x7f3fd62e01a0>, .identity at 0x7f3fd62e01a0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_capstyle(self): x = y = [1, 2] rc = {"lines.solid_capstyle": "projecting"} with mpl.rc_context(rc): > p = Plot(x, y).add(Paths()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Paths(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________________ TestLines.test_xy_data ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Lines(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6f38ec0> args = ('x', (.identity at 0x7f3fd6209170>, .identity at 0x7f3fd6209170>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_xy_data(self): x = [1, 5, 3, np.nan, 2] y = [1, 4, 2, 5, 3] g = [1, 2, 1, 1, 2] > p = Plot(x=x, y=y, group=g).add(Lines()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:239: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Lines(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ______________________ TestLines.test_single_orient_value ______________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Lines(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4c21550> args = ('x', (.identity at 0x7f3fd6c3bed0>, .identity at 0x7f3fd6c3bed0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_single_orient_value(self): x = [1, 1, 1] y = [1, 2, 3] > p = Plot(x, y).add(Lines()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Lines(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError ____________________________ TestRange.test_xy_data ____________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Range(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'ymin', 'ymax'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd4b9fe00> args = ('x', (.identity at 0x7f3fd7422350>, .identity at 0x7f3fd7422350>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_xy_data(self): x = [1, 2] ymin = [1, 4] ymax = [2, 3] > p = Plot(x=x, ymin=ymin, ymax=ymax).add(Range()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:269: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Range(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}] variables = ['x', 'ymin', 'ymax'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
__________________________ TestRange.test_auto_range ___________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Range(artist_...ecting: 'projecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot, common: PlotData, layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd4b9d160>
args = ('x', (.identity at 0x7f3fd4d16610>, .identity at 0x7f3fd4d16610>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_auto_range(self):
        x = [1, 1, 2, 2, 2]
        y = [1, 2, 3, 4, 5]
>       p = Plot(x=x, y=y).add(Range()).plot()

tests/_marks/test_line.py:282:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
seaborn/_core/plot.py:1378: in _setup_scales
    raise PlotSpecError._during("Scale setup", var) from err
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.
_________________________ TestRange.test_mapped_color __________________________

variables = ['x', 'ymin', 'ymax']

    def test_mapped_color(self):
        x = [1, 2, 1, 2]
        ymin = [1, 4, 3, 2]
        ymax = [2, 3, 1, 4]
        group = ["a", "a", "b", "b"]
>       p = Plot(x=x, ymin=ymin, ymax=ymax, color=group).add(Range()).plot()

tests/_marks/test_line.py:295:
seaborn/_core/plot.py:932: in plot
seaborn/_core/plot.py:947: in _plot
seaborn/_core/plot.py:1376: in _setup_scales
    self._scales[var] = scale._setup(var_df[var], prop)
seaborn/_core/scales.py:431: in _setup
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
seaborn/_core/plot.py:1378: PlotSpecError
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.
_______________________ TestRange.test_direct_properties _______________________

variables = ['x', 'ymin', 'ymax']

    def test_direct_properties(self):
        x = [1, 2]
        ymin = [1, 4]
        ymax = [2, 3]
        m = Range(color=".654", linewidth=4)
>       p = Plot(x=x, ymin=ymin, ymax=ymax).add(m).plot()

tests/_marks/test_line.py:312:
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
seaborn/_core/plot.py:1378: PlotSpecError
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.
____________________________ TestDash.test_xy_data _____________________________

variables = ['x', 'y']

    def test_xy_data(self):
        x = [0, 0, 1, 2]
        y = [1, 2, 3, 4]
>       p = Plot(x=x, y=y).add(Dash()).plot()

tests/_marks/test_line.py:327:
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
seaborn/_core/plot.py:1378: PlotSpecError
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.
________________________ TestDash.test_xy_data_grouped _________________________

variables = ['x', 'y']

    def test_xy_data_grouped(self):
        x = [0, 0, 1, 2]
        y = [1, 2, 3, 4]
        color = ["a", "b", "a", "b"]
>       p = Plot(x=x, y=y, color=color).add(Dash()).plot()

tests/_marks/test_line.py:341:
/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError
E   TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given
seaborn/_core/plot.py:1378: PlotSpecError
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.
_________________________ TestDash.test_set_properties _________________________

variables = ['x', 'y']

self = .InternalScale object at 0x7f3fd624b770>
args = ('x', (.identity at 0x7f3fd585ee50>, .identity at 0x7f3fd585ee50>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_set_properties(self): x = [0, 0, 1, 2] y = [1, 2, 3, 4] m = Dash(color=".8", linewidth=4) > p = Plot(x=x, y=y).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...yle.projecting: 'projecting'>}, color='.8', alpha=<1>, linewidth=4, linestyle=, width=<0.8>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _______________________ TestDash.test_mapped_properties ________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=<0.8>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd65430e0> args = ('x', (.identity at 0x7f3fd6172610>, .identity at 0x7f3fd6172610>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_mapped_properties(self): x = [0, 1] y = [1, 2] color = ["a", "b"] linewidth = [1, 2] > p = Plot(x=x, y=y, color=color, linewidth=linewidth).add(Dash()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:371: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=<0.8>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________________ TestDash.test_width ______________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ojecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=0.4), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6c2ea50> args = ('x', (.identity at 0x7f3fd76110c0>, .identity at 0x7f3fd76110c0>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_width(self): x = [0, 0, 1, 2] y = [1, 2, 3, 4] > p = Plot(x=x, y=y).add(Dash(width=.4)).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ojecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=0.4), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _____________________________ TestDash.test_dodge ______________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=<0.8>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd733a3c0> args = ('x', (.identity at 0x7f3fd5428670>, .identity at 0x7f3fd5428670>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_dodge(self): x = [0, 1] y = [1, 2] group = ["a", "b"] > p = Plot(x=x, y=y, group=group).add(Dash(), Dodge()).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_line.py:400: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Dash(artist_k...ecting'>}, color=<'C0'>, alpha=<1>, linewidth=, linestyle=, width=<0.8>), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P(?Px|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
                    self._scales[var] = scale._setup(var_df[var], prop)
                except Exception as err:
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_____________________________ TestText.test_simple _____________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']

    def _setup_scales(
        self, p: Plot,
        common: PlotData,
        layers: list[Layer],
        variables: list[str] | None = None,
    ) -> None:

        if variables is None:
            # Add variables that have data but not a scale, which happens
            # because this method can be called multiple time, to handle
            # variables added during the Stat transform.
            variables = []
            for layer in layers:
                variables.extend(layer["data"].frame.columns)
                for df in layer["data"].frames.values():
                    variables.extend(str(v) for v in df if v not in variables)
            variables = [v for v in variables if v not in self._scales]

        for var in variables:

            # Determine whether this is a coordinate variable
            # (i.e., x/y, paired x/y, or derivative such as xmax)
            m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var)
            if m is None:
                coord = axis = None
            else:
                coord = m["coord"]
                axis = m["axis"]

            # Get keys that handle things like x0, xmax, properly where relevant
            prop_key = var if axis is None else axis
            scale_key = var if coord is None else coord

            if prop_key not in PROPERTIES:
                continue

            # Concatenate layers, using only the relevant coordinate and faceting vars,
            # This is unnecessarily wasteful, as layer data will often be redundant.
            # But figuring out the minimal amount we need is more complicated.
            cols = [var, "col", "row"]
            parts = [common.frame.filter(cols)]
            for layer in layers:
                parts.append(layer["data"].frame.filter(cols))
                for df in layer["data"].frames.values():
                    parts.append(df.filter(cols))
            var_df = pd.concat(parts, ignore_index=True)

            prop = PROPERTIES[prop_key]
            scale = self._get_scale(p, scale_key, prop, var_df[var])

            if scale_key not in p._variables:
                # TODO this implies that the variable was added by the stat
                # It allows downstream orientation inference to work properly.
                # But it feels rather hacky, so ideally revisit.
                scale._priority = 0  # type: ignore

            if axis is None:
                # We could think about having a broader concept of (un)shared properties
                # In general, not something you want to do (different scales in facets)
                # But could make sense e.g. with paired plots. Build later.
                share_state = None
                subplots = []
            else:
                share_state = self._subplots.subplot_spec[f"share{axis}"]
                subplots = [view for view in self._subplots if view[axis] == coord]

            if scale is None:
                self._scales[var] = Scale._identity()
            else:
                try:
>                   self._scales[var] = scale._setup(var_df[var], prop)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6f69e80>
args = ('x', (.identity at 0x7f3fd717bb60>, .identity at 0x7f3fd717bb60>))
kwargs = {}, axis = None

    @wraps(init_func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], mpl.axis.Axis):
            return init_func(self, *args, **kwargs)
        else:
            # Remove 'axis' from kwargs to avoid double assignment
            axis = kwargs.pop('axis', None)
>           return init_func(self, axis, *args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_simple(self):
        x = y = [1, 2, 3]
        s = list("abc")
>       p = Plot(x, y, text=s).add(Text()).plot()
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/_marks/test_text.py:26:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>                   raise PlotSpecError._during("Scale setup", var) from err
E                   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_________________________ TestText.test_set_properties _________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_kws={}, text=<''>, color='red', alpha=0.6, fontsize=6, halign=<'center'>, valign='bottom', offset=<4>), ...}]
variables = ['x', 'y']

>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6d457f0>
args = ('x', (.identity at 0x7f3fd5be9170>, .identity at 0x7f3fd5be9170>))
kwargs = {}, axis = None

>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_set_properties(self):
        x = y = [1, 2, 3]
        s = list("abc")
        color = "red"
        alpha = .6
        fontsize = 6
        valign = "bottom"
        m = Text(color=color, alpha=alpha, fontsize=fontsize, valign=valign)
>       p = Plot(x, y, text=s).add(m).plot()

tests/_marks/test_text.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestText.test_mapped_properties ________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']

>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd4fe1160>
args = ('x', (.identity at 0x7f3fd61a7110>, .identity at 0x7f3fd61a7110>))
kwargs = {}, axis = None

>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_mapped_properties(self):
        x = y = [1, 2, 3]
        s = list("abc")
        color = list("aab")
        fontsize = [1, 2, 4]
>       p = Plot(x, y, color=color, fontsize=fontsize, text=s).add(Text()).plot()

tests/_marks/test_text.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestText.test_mapped_alignment ________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']

>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6248c20>
args = ('x', (.identity at 0x7f3fd6d04b40>, .identity at 0x7f3fd6d04b40>))
kwargs = {}, axis = None

>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_mapped_alignment(self):
        x = [1, 2]
>       p = Plot(x=x, y=x, halign=x, valign=x, text=x).add(Text()).plot()

tests/_marks/test_text.py:75:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
_______________________ TestText.test_identity_fontsize ________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']

>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd6fe7230>
args = ('x', (.identity at 0x7f3fd61a6c40>, .identity at 0x7f3fd61a6c40>))
kwargs = {}, axis = None

>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_identity_fontsize(self):
        x = y = [1, 2, 3]
        s = list("abc")
        fs = [5, 8, 12]
>       p = Plot(x, y, text=s, fontsize=fs).add(Text()).scale(fontsize=None).plot()

tests/_marks/test_text.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information.

seaborn/_core/plot.py:1378: PlotSpecError
________________________ TestText.test_offset_centered _________________________

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']

>                   self._scales[var] = scale._setup(var_df[var], prop)

seaborn/_core/plot.py:1376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/scales.py:431: in _setup
    mpl_scale = new._get_scale(str(data.name), forward, inverse)
seaborn/_core/scales.py:96: in _get_scale
    return InternalScale(name, (forward, inverse))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = .InternalScale object at 0x7f3fd4c201a0>
args = ('x', (.identity at 0x7f3fd5be9bc0>, .identity at 0x7f3fd5be9bc0>))
kwargs = {}, axis = None

>           return init_func(self, axis, *args, **kwargs)
E           TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given

/usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError

The above exception was the direct cause of the following exception:

self =

    def test_offset_centered(self):
        x = y = [1, 2, 3]
        s = list("abc")
>       p = Plot(x, y, text=s).add(Text()).plot()

tests/_marks/test_text.py:97:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
seaborn/_core/plot.py:932: in plot
    return self._plot(pyplot)
seaborn/_core/plot.py:947: in _plot
    plotter._setup_scales(self, common, layers, coord_vars)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
p =
common =
layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_k...''>, color=<'k'>, alpha=<1>, fontsize=, halign=<'center'>, valign=<'center_baseline'>, offset=<4>), ...}]
variables = ['x', 'y']
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestText.test_offset_valign __________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_kws={}, text=<''>, color=<'k'>, alpha=<1>, fontsize=5, halign=<'center'>, valign='bottom', offset=0.1), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd6ae38c0> args = ('x', (.identity at 0x7f3fd65bf060>, .identity at 0x7f3fd65bf060>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_offset_valign(self): x = y = [1, 2, 3] s = list("abc") m = Text(valign="bottom", fontsize=5, offset=.1) > p = Plot(x, y, text=s).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_text.py:108: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_kws={}, text=<''>, color=<'k'>, alpha=<1>, fontsize=5, halign=<'center'>, valign='bottom', offset=0.1), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestText.test_offset_halign __________________________ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_kws={}, text=<''>, color=<'k'>, alpha=<1>, fontsize=10, halign='right', valign=<'center_baseline'>, offset=0.5), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. 
# But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: > self._scales[var] = scale._setup(var_df[var], prop) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:1376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/scales.py:431: in _setup mpl_scale = new._get_scale(str(data.name), forward, inverse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ seaborn/_core/scales.py:96: in _get_scale return InternalScale(name, (forward, inverse)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = .InternalScale object at 0x7f3fd702e120> args = ('x', (.identity at 0x7f3fd4f76f00>, .identity at 0x7f3fd4f76f00>)) kwargs = {}, axis = None @wraps(init_func) def wrapper(self, *args, **kwargs): if args and isinstance(args[0], mpl.axis.Axis): return init_func(self, *args, **kwargs) else: # Remove 'axis' from kwargs to avoid double assignment 
axis = kwargs.pop('axis', None) > return init_func(self, axis, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E TypeError: FuncScale.__init__() takes 3 positional arguments but 4 were given /usr/lib64/python3.14/site-packages/matplotlib/scale.py:178: TypeError The above exception was the direct cause of the following exception: self = def test_offset_halign(self): x = y = [1, 2, 3] s = list("abc") m = Text(halign="right", fontsize=10, offset=.5) > p = Plot(x, y, text=s).add(m).plot() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ tests/_marks/test_text.py:122: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ seaborn/_core/plot.py:932: in plot return self._plot(pyplot) ^^^^^^^^^^^^^^^^^^ seaborn/_core/plot.py:947: in _plot plotter._setup_scales(self, common, layers, coord_vars) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = p = common = layers = [{'data': , 'label': None, 'legend': True, 'mark': Text(artist_kws={}, text=<''>, color=<'k'>, alpha=<1>, fontsize=10, halign='right', valign=<'center_baseline'>, offset=0.5), ...}] variables = ['x', 'y'] def _setup_scales( self, p: Plot, common: PlotData, layers: list[Layer], variables: list[str] | None = None, ) -> None: if variables is None: # Add variables that have data but not a scale, which happens # because this method can be called multiple time, to handle # variables added during the Stat transform. 
variables = [] for layer in layers: variables.extend(layer["data"].frame.columns) for df in layer["data"].frames.values(): variables.extend(str(v) for v in df if v not in variables) variables = [v for v in variables if v not in self._scales] for var in variables: # Determine whether this is a coordinate variable # (i.e., x/y, paired x/y, or derivative such as xmax) m = re.match(r"^(?P<coord>(?P<axis>x|y)\d*).*", var) if m is None: coord = axis = None else: coord = m["coord"] axis = m["axis"] # Get keys that handle things like x0, xmax, properly where relevant prop_key = var if axis is None else axis scale_key = var if coord is None else coord if prop_key not in PROPERTIES: continue # Concatenate layers, using only the relevant coordinate and faceting vars, # This is unnecessarily wasteful, as layer data will often be redundant. # But figuring out the minimal amount we need is more complicated. cols = [var, "col", "row"] parts = [common.frame.filter(cols)] for layer in layers: parts.append(layer["data"].frame.filter(cols)) for df in layer["data"].frames.values(): parts.append(df.filter(cols)) var_df = pd.concat(parts, ignore_index=True) prop = PROPERTIES[prop_key] scale = self._get_scale(p, scale_key, prop, var_df[var]) if scale_key not in p._variables: # TODO this implies that the variable was added by the stat # It allows downstream orientation inference to work properly. # But it feels rather hacky, so ideally revisit. scale._priority = 0 # type: ignore if axis is None: # We could think about having a broader concept of (un)shared properties # In general, not something you want to do (different scales in facets) # But could make sense e.g. with paired plots. Build later. 
share_state = None subplots = [] else: share_state = self._subplots.subplot_spec[f"share{axis}"] subplots = [view for view in self._subplots if view[axis] == coord] if scale is None: self._scales[var] = Scale._identity() else: try: self._scales[var] = scale._setup(var_df[var], prop) except Exception as err: > raise PlotSpecError._during("Scale setup", var) from err E seaborn._core.exceptions.PlotSpecError: Scale setup failed for the `x` variable. See the traceback above for more information. seaborn/_core/plot.py:1378: PlotSpecError _________________________ TestStripPlot.test_log_scale _________________________ self = def test_log_scale(self): x = [1, 10, 100, 1000] ax = plt.figure().subplots() ax.set_xscale("log") self.func(x=x) vals = ax.collections[0].get_offsets()[:, 0] > assert_array_equal(x, vals) E AssertionError: E Arrays are not equal E E Mismatched elements: 3 / 4 (75%) E Mismatch at indices: E [1]: 10 (ACTUAL), 10.000000000000002 (DESIRED) E [2]: 100 (ACTUAL), 100.00000000000004 (DESIRED) E [3]: 1000 (ACTUAL), 1000.0000000000007 (DESIRED) E Max absolute difference among violations: 6.82121026e-13 E Max relative difference among violations: 6.82121026e-16 E ACTUAL: array([ 1, 10, 100, 1000]) E DESIRED: MaskedArray([1.e+00, 1.e+01, 1.e+02, 1.e+03]) tests/test_categorical.py:667: AssertionError _________________________ TestSwarmPlot.test_log_scale _________________________ self = def test_log_scale(self): x = [1, 10, 100, 1000] ax = plt.figure().subplots() ax.set_xscale("log") self.func(x=x) vals = ax.collections[0].get_offsets()[:, 0] > assert_array_equal(x, vals) E AssertionError: E Arrays are not equal E E Mismatched elements: 3 / 4 (75%) E Mismatch at indices: E [1]: 10 (ACTUAL), 10.000000000000002 (DESIRED) E [2]: 100 (ACTUAL), 100.00000000000004 (DESIRED) E [3]: 1000 (ACTUAL), 1000.0000000000007 (DESIRED) E Max absolute difference among violations: 6.82121026e-13 E Max relative difference among violations: 6.82121026e-16 E ACTUAL: array([ 1, 10, 
100, 1000]) E DESIRED: MaskedArray([1.e+00, 1.e+01, 1.e+02, 1.e+03]) tests/test_categorical.py:667: AssertionError _____________________ TestKDEPlotBivariate.test_log_scale ______________________ self = rng = RandomState(MT19937) at 0x7F3FD64E0740 def test_log_scale(self, rng): x = rng.lognormal(0, 1, 100) y = rng.uniform(0, 1, 100) levels = .2, .5, 1 f, ax = plt.subplots() kdeplot(x=x, y=y, log_scale=True, levels=levels, ax=ax) assert ax.get_xscale() == "log" assert ax.get_yscale() == "log" f, (ax1, ax2) = plt.subplots(ncols=2) kdeplot(x=x, y=y, log_scale=(10, False), levels=levels, ax=ax1) assert ax1.get_xscale() == "log" assert ax1.get_yscale() == "linear" p = _DistributionPlotter() kde = KDE() density, (xx, yy) = kde(np.log10(x), y) levels = p._quantile_to_level(density, levels) ax2.contour(10 ** xx, yy, density, levels=levels) for c1, c2 in zip(ax1.collections, ax2.collections): assert len(get_contour_coords(c1)) == len(get_contour_coords(c2)) for arr1, arr2 in zip(get_contour_coords(c1), get_contour_coords(c2)): > assert_array_equal(arr1, arr2) E AssertionError: E Arrays are not equal E E Mismatched elements: 242 / 998 (24.2%) E First 5 mismatches are at indices: E [6, 0]: 1.0120830563623893 (ACTUAL), 1.012083056362389 (DESIRED) E [10, 0]: 1.1276980041383862 (ACTUAL), 1.127698004138386 (DESIRED) E [11, 0]: 1.169100099965086 (ACTUAL), 1.1691000999650858 (DESIRED) E [24, 0]: 1.6766428303869279 (ACTUAL), 1.6766428303869276 (DESIRED) E [25, 0]: 1.71467549468341 (ACTUAL), 1.7146754946834097 (DESIRED) E Max absolute difference among violations: 1.77635684e-15 E Max relative difference among violations: 3.90883405e-16 E ACTUAL: array([[ 8.151975e-01, -7.489397e-02], E [ 8.451265e-01, -7.592330e-02], E [ 8.761543e-01, -7.659791e-02],... E DESIRED: array([[ 8.151975e-01, -7.489397e-02], E [ 8.451265e-01, -7.592330e-02], E [ 8.761543e-01, -7.659791e-02],... 
tests/test_distributions.py:1020: AssertionError ________________________ TestLinePlotter.test_log_scale ________________________ self = def test_log_scale(self): f, ax = plt.subplots() ax.set_xscale("log") x = [1, 10, 100] y = [1, 2, 3] lineplot(x=x, y=y) line = ax.lines[0] > assert_array_equal(line.get_xdata(), x) E AssertionError: E Arrays are not equal E E Mismatched elements: 2 / 3 (66.7%) E Mismatch at indices: E [1]: 10.000000000000002 (ACTUAL), 10 (DESIRED) E [2]: 100.00000000000004 (ACTUAL), 100 (DESIRED) E Max absolute difference among violations: 4.26325641e-14 E Max relative difference among violations: 4.26325641e-16 E ACTUAL: array([ 1., 10., 100.]) E DESIRED: array([ 1, 10, 100]) tests/test_relational.py:1124: AssertionError =============================== warnings summary =============================== tests/test_base.py::TestSizeMapping::test_array_palette_deprecation /usr/lib/python3.14/site-packages/pluggy/_callers.py:121: UserWarning: The palette list has fewer values (2) than needed (3) and will cycle, which may produce an uninterpretable plot. res = hook_impl.function(*args) tests/test_base.py::TestVectorPlotter::test_attach_converters /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_base.py:1133: MatplotlibDeprecationWarning: The converter attribute was deprecated in Matplotlib 3.10 and will be removed in 3.12. Use get_converter and set_converter methods instead. assert ax.xaxis.converter is None tests/test_base.py::TestVectorPlotter::test_attach_converters /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_base.py:1134: MatplotlibDeprecationWarning: The converter attribute was deprecated in Matplotlib 3.10 and will be removed in 3.12. Use get_converter and set_converter methods instead. 
assert "Date" in ax.yaxis.converter.__class__.__name__ tests/test_base.py::TestVectorPlotter::test_attach_converters /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_base.py:1139: MatplotlibDeprecationWarning: The converter attribute was deprecated in Matplotlib 3.10 and will be removed in 3.12. Use get_converter and set_converter methods instead. assert "CategoryConverter" in ax.xaxis.converter.__class__.__name__ tests/test_base.py::TestVectorPlotter::test_attach_converters /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_base.py:1140: MatplotlibDeprecationWarning: The converter attribute was deprecated in Matplotlib 3.10 and will be removed in 3.12. Use get_converter and set_converter methods instead. assert ax.yaxis.converter is None tests/test_categorical.py: 168 warnings /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/seaborn/categorical.py:700: MatplotlibDeprecationWarning: vert: bool was deprecated in Matplotlib 3.11 and will be removed in 3.13. Use orientation: {'vertical', 'horizontal'} instead. artists = ax.bxp(**boxplot_kws) tests/test_matrix.py::TestHeatmap::test_custom_diverging_vlims tests/test_matrix.py::TestHeatmap::test_centered_vlims tests/test_matrix.py::TestHeatmap::test_custom_center_colors tests/test_matrix.py::TestHeatmap::test_cmap_with_properties tests/test_matrix.py::TestHeatmap::test_cmap_with_properties tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/seaborn/matrix.py:243: PendingDeprecationWarning: The set_bad function will be deprecated in a future version. Use cmap.with_extremes(bad=...) or Colormap(bad=...) instead. self.cmap.set_bad(bad) tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_matrix.py:230: PendingDeprecationWarning: The set_bad function will be deprecated in a future version. 
Use cmap.with_extremes(bad=...) or Colormap(bad=...) instead. cmap.set_bad("red") tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_matrix.py:245: PendingDeprecationWarning: The set_under function will be deprecated in a future version. Use cmap.with_extremes(under=...) or Colormap(under=...) instead. cmap.set_under("red") tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/seaborn/matrix.py:245: PendingDeprecationWarning: The set_under function will be deprecated in a future version. Use cmap.with_extremes(under=...) or Colormap(under=...) instead. self.cmap.set_under(under) tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/tests/test_matrix.py:256: PendingDeprecationWarning: The set_over function will be deprecated in a future version. Use cmap.with_extremes(over=...) or Colormap(over=...) instead. cmap.set_over("red") tests/test_matrix.py::TestHeatmap::test_cmap_with_properties /builddir/build/BUILD/python-seaborn-0.13.2-build/seaborn-0.13.2/seaborn/matrix.py:247: PendingDeprecationWarning: The set_over function will be deprecated in a future version. Use cmap.with_extremes(over=...) or Colormap(over=...) instead. self.cmap.set_over(over) -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html =========================== short test summary info ============================ FAILED tests/_core/test_plot.py::TestLayerAddition::test_without_data - seabo... 
FAILED tests/_core/test_plot.py::TestLayerAddition::test_with_new_variable_by_name FAILED tests/_core/test_plot.py::TestLayerAddition::test_with_new_variable_by_vector FAILED tests/_core/test_plot.py::TestLayerAddition::test_with_late_data_definition FAILED tests/_core/test_plot.py::TestLayerAddition::test_with_new_data_definition FAILED tests/_core/test_plot.py::TestLayerAddition::test_drop_variable - seab... FAILED tests/_core/test_plot.py::TestLayerAddition::test_orient[x-x] - seabor... FAILED tests/_core/test_plot.py::TestLayerAddition::test_orient[y-y] - seabor... FAILED tests/_core/test_plot.py::TestLayerAddition::test_orient[v-x] - seabor... FAILED tests/_core/test_plot.py::TestLayerAddition::test_orient[h-y] - seabor... FAILED tests/_core/test_plot.py::TestScaling::test_inference - seaborn._core.... FAILED tests/_core/test_plot.py::TestScaling::test_inference_from_layer_data FAILED tests/_core/test_plot.py::TestScaling::test_inference_joins - seaborn.... FAILED tests/_core/test_plot.py::TestScaling::test_inferred_categorical_converter FAILED tests/_core/test_plot.py::TestScaling::test_explicit_categorical_converter FAILED tests/_core/test_plot.py::TestScaling::test_faceted_log_scale - seabor... FAILED tests/_core/test_plot.py::TestScaling::test_paired_single_log_scale - ... FAILED tests/_core/test_plot.py::TestScaling::test_paired_with_common_fallback FAILED tests/_core/test_plot.py::TestScaling::test_mark_data_log_transform_is_inverted FAILED tests/_core/test_plot.py::TestScaling::test_mark_data_log_transfrom_with_stat FAILED tests/_core/test_plot.py::TestScaling::test_mark_data_from_categorical FAILED tests/_core/test_plot.py::TestScaling::test_mark_data_from_datetime - ... FAILED tests/_core/test_plot.py::TestScaling::test_computed_var_ticks - seabo... FAILED tests/_core/test_plot.py::TestScaling::test_computed_var_transform - s... 
FAILED tests/_core/test_plot.py::TestScaling::test_explicit_range_with_axis_scaling FAILED tests/_core/test_plot.py::TestScaling::test_derived_range_with_axis_scaling FAILED tests/_core/test_plot.py::TestScaling::test_facet_categories - seaborn... FAILED tests/_core/test_plot.py::TestScaling::test_facet_categories_unshared FAILED tests/_core/test_plot.py::TestScaling::test_facet_categories_single_dim_shared FAILED tests/_core/test_plot.py::TestScaling::test_pair_categories - seaborn.... FAILED tests/_core/test_plot.py::TestScaling::test_pair_categories_shared - s... FAILED tests/_core/test_plot.py::TestScaling::test_identity_mapping_linewidth FAILED tests/_core/test_plot.py::TestScaling::test_pair_single_coordinate_stat_orient FAILED tests/_core/test_plot.py::TestScaling::test_inferred_nominal_passed_to_stat FAILED tests/_core/test_plot.py::TestScaling::test_identity_mapping_color_tuples FAILED tests/_core/test_plot.py::TestScaling::test_nominal_x_axis_tweaks - se... FAILED tests/_core/test_plot.py::TestScaling::test_nominal_y_axis_tweaks - se... FAILED tests/_core/test_plot.py::TestPlotting::test_no_orient_variance - seab... FAILED tests/_core/test_plot.py::TestPlotting::test_single_split_single_layer FAILED tests/_core/test_plot.py::TestPlotting::test_single_split_multi_layer FAILED tests/_core/test_plot.py::TestPlotting::test_one_grouping_variable[color] FAILED tests/_core/test_plot.py::TestPlotting::test_one_grouping_variable[group] FAILED tests/_core/test_plot.py::TestPlotting::test_two_grouping_variables - ... FAILED tests/_core/test_plot.py::TestPlotting::test_specified_width - seaborn... FAILED tests/_core/test_plot.py::TestPlotting::test_facets_no_subgroups - sea... FAILED tests/_core/test_plot.py::TestPlotting::test_facets_one_subgroup - sea... FAILED tests/_core/test_plot.py::TestPlotting::test_layer_specific_facet_disabling FAILED tests/_core/test_plot.py::TestPlotting::test_paired_variables - seabor... 
FAILED tests/_core/test_plot.py::TestPlotting::test_paired_one_dimension - se...
FAILED tests/_core/test_plot.py::TestPlotting::test_paired_variables_one_subset
FAILED tests/_core/test_plot.py::TestPlotting::test_paired_and_faceted - seab...
FAILED tests/_core/test_plot.py::TestPlotting::test_theme_validation - Assert...
FAILED tests/_core/test_plot.py::TestPlotting::test_stat - seaborn._core.exce...
FAILED tests/_core/test_plot.py::TestPlotting::test_move - seaborn._core.exce...
FAILED tests/_core/test_plot.py::TestPlotting::test_stat_and_move - seaborn._...
FAILED tests/_core/test_plot.py::TestPlotting::test_stat_log_scale - seaborn....
FAILED tests/_core/test_plot.py::TestPlotting::test_move_log_scale - seaborn....
FAILED tests/_core/test_plot.py::TestPlotting::test_multi_move - seaborn._cor...
FAILED tests/_core/test_plot.py::TestPlotting::test_multi_move_with_pairing
FAILED tests/_core/test_plot.py::TestPlotting::test_move_with_range - seaborn...
FAILED tests/_core/test_plot.py::TestPlotting::test_on_axes - seaborn._core.e...
FAILED tests/_core/test_plot.py::TestPlotting::test_on_figure[True] - seaborn...
FAILED tests/_core/test_plot.py::TestPlotting::test_on_figure[False] - seabor...
FAILED tests/_core/test_plot.py::TestPlotting::test_on_subfigure[True] - seab...
FAILED tests/_core/test_plot.py::TestPlotting::test_on_subfigure[False] - sea...
FAILED tests/_core/test_plot.py::TestPlotting::test_axis_labels_from_constructor
FAILED tests/_core/test_plot.py::TestPlotting::test_axis_labels_from_layer - ...
FAILED tests/_core/test_plot.py::TestPlotting::test_axis_labels_are_first_name
FAILED tests/_core/test_plot.py::TestPlotting::test_limits - seaborn._core.ex...
FAILED tests/_core/test_plot.py::TestPlotting::test_labels_axis - seaborn._co...
FAILED tests/_core/test_plot.py::TestPlotting::test_labels_legend - seaborn._...
FAILED tests/_core/test_plot.py::TestExceptions::test_scale_setup - Assertion...
FAILED tests/_core/test_plot.py::TestExceptions::test_coordinate_scaling - As...
FAILED tests/_core/test_plot.py::TestExceptions::test_semantic_scaling - Asse...
FAILED tests/_core/test_plot.py::TestFacetInterface::test_unshared_spacing - ...
FAILED tests/_core/test_plot.py::TestPairInterface::test_all_numeric[list] - ...
FAILED tests/_core/test_plot.py::TestPairInterface::test_all_numeric[Index]
FAILED tests/_core/test_plot.py::TestPairInterface::test_single_dimension[x]
FAILED tests/_core/test_plot.py::TestPairInterface::test_single_dimension[y]
FAILED tests/_core/test_plot.py::TestPairInterface::test_non_cross - seaborn....
FAILED tests/_core/test_plot.py::TestPairInterface::test_list_of_vectors - se...
FAILED tests/_core/test_plot.py::TestPairInterface::test_with_facets - seabor...
FAILED tests/_core/test_plot.py::TestPairInterface::test_axis_sharing - seabo...
FAILED tests/_core/test_plot.py::TestPairInterface::test_axis_sharing_with_facets
FAILED tests/_core/test_plot.py::TestPairInterface::test_x_wrapping - seaborn...
FAILED tests/_core/test_plot.py::TestPairInterface::test_y_wrapping - seaborn...
FAILED tests/_core/test_plot.py::TestPairInterface::test_non_cross_wrapping
FAILED tests/_core/test_plot.py::TestPairInterface::test_orient_inference - s...
FAILED tests/_core/test_plot.py::TestPairInterface::test_computed_coordinate_orient_inference
FAILED tests/_core/test_plot.py::TestPairInterface::test_limits - seaborn._co...
FAILED tests/_core/test_plot.py::TestPairInterface::test_labels - seaborn._co...
FAILED tests/_core/test_plot.py::TestLabelVisibility::test_single_subplot - s...
FAILED tests/_core/test_plot.py::TestLabelVisibility::test_1d_column[facet_kws0-pair_kws0]
FAILED tests/_core/test_plot.py::TestLabelVisibility::test_1d_column[facet_kws1-pair_kws1]
FAILED tests/_core/test_plot.py::TestLabelVisibility::test_1d_row[facet_kws0-pair_kws0]
FAILED tests/_core/test_plot.py::TestLabelVisibility::test_1d_row[facet_kws1-pair_kws1]
FAILED tests/_core/test_plot.py::TestLegend::test_single_layer_single_variable
FAILED tests/_core/test_plot.py::TestLegend::test_single_layer_common_variable
FAILED tests/_core/test_plot.py::TestLegend::test_single_layer_common_unnamed_variable
FAILED tests/_core/test_plot.py::TestLegend::test_single_layer_multi_variable
FAILED tests/_core/test_plot.py::TestLegend::test_multi_layer_single_variable
FAILED tests/_core/test_plot.py::TestLegend::test_multi_layer_multi_variable
FAILED tests/_core/test_plot.py::TestLegend::test_multi_layer_different_artists
FAILED tests/_core/test_plot.py::TestLegend::test_three_layers - seaborn._cor...
FAILED tests/_core/test_plot.py::TestLegend::test_identity_scale_ignored - se...
FAILED tests/_core/test_plot.py::TestLegend::test_suppression_in_add_method
FAILED tests/_core/test_plot.py::TestLegend::test_anonymous_title - seaborn._...
FAILED tests/_core/test_plot.py::TestLegend::test_legendless_mark - seaborn._...
FAILED tests/_core/test_plot.py::TestLegend::test_legend_has_no_offset - seab...
FAILED tests/_core/test_plot.py::TestLegend::test_layer_legend - seaborn._cor...
FAILED tests/_core/test_plot.py::TestLegend::test_layer_legend_with_scale_legend
FAILED tests/_core/test_plot.py::TestLegend::test_layer_legend_title - seabor...
FAILED tests/_core/test_scales.py::TestContinuous::test_coordinate_defaults
FAILED tests/_core/test_scales.py::TestContinuous::test_coordinate_transform
FAILED tests/_core/test_scales.py::TestContinuous::test_coordinate_transform_with_parameter
FAILED tests/_core/test_scales.py::TestContinuous::test_interval_defaults - T...
FAILED tests/_core/test_scales.py::TestContinuous::test_interval_with_range
FAILED tests/_core/test_scales.py::TestContinuous::test_interval_with_norm - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_interval_with_range_norm_and_transform
FAILED tests/_core/test_scales.py::TestContinuous::test_interval_with_bools
FAILED tests/_core/test_scales.py::TestContinuous::test_color_defaults - Type...
FAILED tests/_core/test_scales.py::TestContinuous::test_color_named_values - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_color_tuple_values - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_color_callable_values
FAILED tests/_core/test_scales.py::TestContinuous::test_color_with_norm - Typ...
FAILED tests/_core/test_scales.py::TestContinuous::test_color_with_transform
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_locator - TypeEr...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_upto - TypeError...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_every - TypeErro...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_every_between - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_at - TypeError: ...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_count - TypeErro...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_count_between - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_tick_minor - TypeErro...
FAILED tests/_core/test_scales.py::TestContinuous::test_log_tick_default - Ty...
FAILED tests/_core/test_scales.py::TestContinuous::test_log_tick_upto - TypeE...
FAILED tests/_core/test_scales.py::TestContinuous::test_log_tick_count - Type...
FAILED tests/_core/test_scales.py::TestContinuous::test_log_tick_format_disabled
FAILED tests/_core/test_scales.py::TestContinuous::test_symlog_tick_default
FAILED tests/_core/test_scales.py::TestContinuous::test_label_formatter - Typ...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_like_pattern - ...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_like_string - T...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_like_function
FAILED tests/_core/test_scales.py::TestContinuous::test_label_base - TypeErro...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_unit - TypeErro...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_unit_with_sep
FAILED tests/_core/test_scales.py::TestContinuous::test_label_empty_unit - Ty...
FAILED tests/_core/test_scales.py::TestContinuous::test_label_base_from_transform
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_defaults - Ty...
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_with_order - ...
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_with_subset_order
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_axis - TypeEr...
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_axis_with_order
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_axis_with_subset_order
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_axis_with_category_dtype
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_numeric_data
FAILED tests/_core/test_scales.py::TestNominal::test_coordinate_numeric_data_with_order
FAILED tests/_core/test_scales.py::TestNominal::test_color_defaults - TypeErr...
FAILED tests/_core/test_scales.py::TestNominal::test_color_named_palette - Ty...
FAILED tests/_core/test_scales.py::TestNominal::test_color_list_palette - Typ...
FAILED tests/_core/test_scales.py::TestNominal::test_color_dict_palette - Typ...
FAILED tests/_core/test_scales.py::TestNominal::test_color_numeric_data - Typ...
FAILED tests/_core/test_scales.py::TestNominal::test_color_numeric_with_order_subset
FAILED tests/_core/test_scales.py::TestNominal::test_color_alpha_in_palette
FAILED tests/_core/test_scales.py::TestNominal::test_color_unknown_palette - ...
FAILED tests/_core/test_scales.py::TestNominal::test_object_defaults - TypeEr...
FAILED tests/_core/test_scales.py::TestNominal::test_object_list - TypeError:...
FAILED tests/_core/test_scales.py::TestNominal::test_object_dict - TypeError:...
FAILED tests/_core/test_scales.py::TestNominal::test_object_order - TypeError...
FAILED tests/_core/test_scales.py::TestNominal::test_object_order_subset - Ty...
FAILED tests/_core/test_scales.py::TestNominal::test_objects_that_are_weird
FAILED tests/_core/test_scales.py::TestNominal::test_alpha_default - TypeErro...
FAILED tests/_core/test_scales.py::TestNominal::test_fill - TypeError: Linear...
FAILED tests/_core/test_scales.py::TestNominal::test_fill_dict - TypeError: L...
FAILED tests/_core/test_scales.py::TestNominal::test_fill_nunique_warning - F...
FAILED tests/_core/test_scales.py::TestNominal::test_interval_defaults - Type...
FAILED tests/_core/test_scales.py::TestNominal::test_interval_tuple - TypeErr...
FAILED tests/_core/test_scales.py::TestNominal::test_interval_tuple_numeric
FAILED tests/_core/test_scales.py::TestNominal::test_interval_list - TypeErro...
FAILED tests/_core/test_scales.py::TestNominal::test_interval_dict - TypeErro...
FAILED tests/_core/test_scales.py::TestNominal::test_interval_with_transform
FAILED tests/_core/test_scales.py::TestNominal::test_empty_data - TypeError: ...
FAILED tests/_core/test_scales.py::TestNominal::test_finalize - TypeError: Li...
FAILED tests/_core/test_scales.py::TestTemporal::test_coordinate_defaults - T...
FAILED tests/_core/test_scales.py::TestTemporal::test_interval_defaults - Typ...
FAILED tests/_core/test_scales.py::TestTemporal::test_interval_with_range - T...
FAILED tests/_core/test_scales.py::TestTemporal::test_interval_with_norm - Ty...
FAILED tests/_core/test_scales.py::TestTemporal::test_color_defaults - TypeEr...
FAILED tests/_core/test_scales.py::TestTemporal::test_color_named_values - Ty...
FAILED tests/_core/test_scales.py::TestTemporal::test_coordinate_axis - TypeE...
FAILED tests/_core/test_scales.py::TestTemporal::test_tick_locator - TypeErro...
FAILED tests/_core/test_scales.py::TestTemporal::test_tick_upto - TypeError: ...
FAILED tests/_core/test_scales.py::TestTemporal::test_label_formatter - TypeE...
FAILED tests/_core/test_scales.py::TestTemporal::test_label_concise - TypeErr...
FAILED tests/_core/test_scales.py::TestBoolean::test_coordinate - TypeError: ...
FAILED tests/_core/test_scales.py::TestBoolean::test_coordinate_axis - TypeEr...
FAILED tests/_core/test_scales.py::TestBoolean::test_coordinate_missing[object-nan]
FAILED tests/_core/test_scales.py::TestBoolean::test_coordinate_missing[object-None]
FAILED tests/_core/test_scales.py::TestBoolean::test_coordinate_missing[boolean-value2]
FAILED tests/_core/test_scales.py::TestBoolean::test_color_defaults - TypeErr...
FAILED tests/_core/test_scales.py::TestBoolean::test_color_list_palette - Typ...
FAILED tests/_core/test_scales.py::TestBoolean::test_color_tuple_palette - Ty...
FAILED tests/_core/test_scales.py::TestBoolean::test_color_dict_palette - Typ...
FAILED tests/_core/test_scales.py::TestBoolean::test_object_defaults - TypeEr...
FAILED tests/_core/test_scales.py::TestBoolean::test_object_list - TypeError:...
FAILED tests/_core/test_scales.py::TestBoolean::test_object_dict - TypeError:...
FAILED tests/_core/test_scales.py::TestBoolean::test_fill - TypeError: FuncSc...
FAILED tests/_core/test_scales.py::TestBoolean::test_interval_defaults - Type...
FAILED tests/_core/test_scales.py::TestBoolean::test_interval_tuple - TypeErr...
FAILED tests/_core/test_scales.py::TestBoolean::test_finalize - TypeError: Fu...
FAILED tests/_marks/test_area.py::TestArea::test_single_defaults - seaborn._c...
FAILED tests/_marks/test_area.py::TestArea::test_set_properties - seaborn._co...
FAILED tests/_marks/test_area.py::TestArea::test_mapped_properties - seaborn....
FAILED tests/_marks/test_area.py::TestArea::test_unfilled - seaborn._core.exc...
FAILED tests/_marks/test_area.py::TestBand::test_range - seaborn._core.except...
FAILED tests/_marks/test_area.py::TestBand::test_auto_range - seaborn._core.e...
FAILED tests/_marks/test_bar.py::TestBar::test_categorical_positions_vertical
FAILED tests/_marks/test_bar.py::TestBar::test_categorical_positions_horizontal
FAILED tests/_marks/test_bar.py::TestBar::test_numeric_positions_vertical - s...
FAILED tests/_marks/test_bar.py::TestBar::test_numeric_positions_horizontal
FAILED tests/_marks/test_bar.py::TestBar::test_set_properties - seaborn._core...
FAILED tests/_marks/test_bar.py::TestBar::test_mapped_properties - seaborn._c...
FAILED tests/_marks/test_bar.py::TestBar::test_zero_height_skipped - seaborn....
FAILED tests/_marks/test_bar.py::TestBar::test_artist_kws_clip - seaborn._cor...
FAILED tests/_marks/test_bar.py::TestBars::test_positions - seaborn._core.exc...
FAILED tests/_marks/test_bar.py::TestBars::test_positions_horizontal - seabor...
FAILED tests/_marks/test_bar.py::TestBars::test_width - seaborn._core.excepti...
FAILED tests/_marks/test_bar.py::TestBars::test_mapped_color_direct_alpha - s...
FAILED tests/_marks/test_bar.py::TestBars::test_mapped_edgewidth - seaborn._c...
FAILED tests/_marks/test_bar.py::TestBars::test_auto_edgewidth - seaborn._cor...
FAILED tests/_marks/test_bar.py::TestBars::test_unfilled - seaborn._core.exce...
FAILED tests/_marks/test_bar.py::TestBars::test_log_scale - seaborn._core.exc...
FAILED tests/_marks/test_dot.py::TestDot::test_simple - seaborn._core.excepti...
FAILED tests/_marks/test_dot.py::TestDot::test_filled_unfilled_mix - seaborn....
FAILED tests/_marks/test_dot.py::TestDot::test_missing_coordinate_data - seab...
FAILED tests/_marks/test_dot.py::TestDot::test_missing_semantic_data[color]
FAILED tests/_marks/test_dot.py::TestDot::test_missing_semantic_data[fill] - ...
FAILED tests/_marks/test_dot.py::TestDot::test_missing_semantic_data[marker]
FAILED tests/_marks/test_dot.py::TestDot::test_missing_semantic_data[pointsize]
FAILED tests/_marks/test_dot.py::TestDots::test_simple - seaborn._core.except...
FAILED tests/_marks/test_dot.py::TestDots::test_set_color - seaborn._core.exc...
FAILED tests/_marks/test_dot.py::TestDots::test_map_color - seaborn._core.exc...
FAILED tests/_marks/test_dot.py::TestDots::test_fill - seaborn._core.exceptio...
FAILED tests/_marks/test_dot.py::TestDots::test_pointsize - seaborn._core.exc...
FAILED tests/_marks/test_dot.py::TestDots::test_stroke - seaborn._core.except...
FAILED tests/_marks/test_dot.py::TestDots::test_filled_unfilled_mix - seaborn...
FAILED tests/_marks/test_line.py::TestPath::test_xy_data - seaborn._core.exce...
FAILED tests/_marks/test_line.py::TestPath::test_shared_colors_direct - seabo...
FAILED tests/_marks/test_line.py::TestPath::test_separate_colors_direct - sea...
FAILED tests/_marks/test_line.py::TestPath::test_shared_colors_mapped - seabo...
FAILED tests/_marks/test_line.py::TestPath::test_separate_colors_mapped - sea...
FAILED tests/_marks/test_line.py::TestPath::test_color_with_alpha - seaborn._...
FAILED tests/_marks/test_line.py::TestPath::test_color_and_alpha - seaborn._c...
FAILED tests/_marks/test_line.py::TestPath::test_other_props_direct - seaborn...
FAILED tests/_marks/test_line.py::TestPath::test_other_props_mapped - seaborn...
FAILED tests/_marks/test_line.py::TestPath::test_capstyle - seaborn._core.exc...
FAILED tests/_marks/test_line.py::TestLine::test_xy_data - seaborn._core.exce...
FAILED tests/_marks/test_line.py::TestPaths::test_xy_data - seaborn._core.exc...
FAILED tests/_marks/test_line.py::TestPaths::test_set_properties - seaborn._c...
FAILED tests/_marks/test_line.py::TestPaths::test_mapped_properties - seaborn...
FAILED tests/_marks/test_line.py::TestPaths::test_color_with_alpha - seaborn....
FAILED tests/_marks/test_line.py::TestPaths::test_color_and_alpha - seaborn._...
FAILED tests/_marks/test_line.py::TestPaths::test_capstyle - seaborn._core.ex...
FAILED tests/_marks/test_line.py::TestLines::test_xy_data - seaborn._core.exc...
FAILED tests/_marks/test_line.py::TestLines::test_single_orient_value - seabo...
FAILED tests/_marks/test_line.py::TestRange::test_xy_data - seaborn._core.exc...
FAILED tests/_marks/test_line.py::TestRange::test_auto_range - seaborn._core....
FAILED tests/_marks/test_line.py::TestRange::test_mapped_color - seaborn._cor...
FAILED tests/_marks/test_line.py::TestRange::test_direct_properties - seaborn...
FAILED tests/_marks/test_line.py::TestDash::test_xy_data - seaborn._core.exce...
FAILED tests/_marks/test_line.py::TestDash::test_xy_data_grouped - seaborn._c...
FAILED tests/_marks/test_line.py::TestDash::test_set_properties - seaborn._co...
FAILED tests/_marks/test_line.py::TestDash::test_mapped_properties - seaborn....
FAILED tests/_marks/test_line.py::TestDash::test_width - seaborn._core.except...
FAILED tests/_marks/test_line.py::TestDash::test_dodge - seaborn._core.except...
FAILED tests/_marks/test_text.py::TestText::test_simple - seaborn._core.excep...
FAILED tests/_marks/test_text.py::TestText::test_set_properties - seaborn._co...
FAILED tests/_marks/test_text.py::TestText::test_mapped_properties - seaborn....
FAILED tests/_marks/test_text.py::TestText::test_mapped_alignment - seaborn._...
FAILED tests/_marks/test_text.py::TestText::test_identity_fontsize - seaborn....
FAILED tests/_marks/test_text.py::TestText::test_offset_centered - seaborn._c...
FAILED tests/_marks/test_text.py::TestText::test_offset_valign - seaborn._cor...
FAILED tests/_marks/test_text.py::TestText::test_offset_halign - seaborn._cor...
FAILED tests/test_categorical.py::TestStripPlot::test_log_scale - AssertionEr...
FAILED tests/test_categorical.py::TestSwarmPlot::test_log_scale - AssertionEr...
FAILED tests/test_distributions.py::TestKDEPlotBivariate::test_log_scale - As...
FAILED tests/test_relational.py::TestLinePlotter::test_log_scale - AssertionE...
= 287 failed, 2067 passed, 16 skipped, 4 deselected, 6 xfailed, 184 warnings in 613.24s (0:10:13) =
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.s4cFI3 (%check)
    Bad exit status from /var/tmp/rpm-tmp.s4cFI3 (%check)
Finish: rpmbuild python-seaborn-0.13.2-18.fc45.src.rpm
Finish: build phase for python-seaborn-0.13.2-18.fc45.src.rpm
INFO: chroot_scan: 1 files copied to /var/lib/copr-rpmbuild/results/chroot_scan
INFO: /var/lib/mock/fedora-rawhide-x86_64-1777362911.442062/root/var/log/dnf5.log
INFO: chroot_scan: creating tarball /var/lib/copr-rpmbuild/results/chroot_scan.tar.gz
/bin/tar: Removing leading `/' from member names
ERROR: Exception(/var/lib/copr-rpmbuild/results/python-seaborn-0.13.2-18.fc45.src.rpm) Config(fedora-rawhide-x86_64) 10 minutes 54 seconds
INFO: Results and/or logs in: /var/lib/copr-rpmbuild/results
INFO: Cleaning up build root ('cleanup_on_failure=True')
Start: clean chroot
INFO: unmounting tmpfs.
Finish: clean chroot
ERROR: Command failed:
 # /usr/bin/systemd-nspawn -q -M 0e1d296527414f99b174fdb4b275e413 -D /var/lib/mock/fedora-rawhide-x86_64-1777362911.442062/root -a -u mockbuild --capability=cap_ipc_lock --capability=cap_ipc_lock --bind=/tmp/mock-resolv.h8sj_5da:/etc/resolv.conf --bind=/dev/btrfs-control --bind=/dev/mapper/control --bind=/dev/fuse --bind=/dev/loop-control --bind=/dev/loop0 --bind=/dev/loop1 --bind=/dev/loop2 --bind=/dev/loop3 --bind=/dev/loop4 --bind=/dev/loop5 --bind=/dev/loop6 --bind=/dev/loop7 --bind=/dev/loop8 --bind=/dev/loop9 --bind=/dev/loop10 --bind=/dev/loop11 --console=pipe --setenv=TERM=vt100 --setenv=SHELL=/bin/bash --setenv=HOME=/builddir --setenv=HOSTNAME=mock --setenv=PATH=/usr/bin:/bin:/usr/sbin:/sbin '--setenv=PROMPT_COMMAND=printf "\033]0;\007"' '--setenv=PS1= \s-\v\$ ' --setenv=LANG=C.UTF-8 --resolv-conf=off bash --login -c '/usr/bin/rpmbuild -ba --noprep --target x86_64 /builddir/build/originals/python-seaborn.spec'
Copr build error: Build failed