Warning: Permanently added '54.174.159.113' (ED25519) to the list of known hosts.

You can reproduce this build on your computer by running:

sudo dnf install copr-rpmbuild
/usr/bin/copr-rpmbuild --verbose --drop-resultdir --srpm --task-url https://copr.fedorainfracloud.org/backend/get-srpm-build-task/9856591

Version: 1.6
PID: 8660
Logging PID: 8662
Task:
{'appstream': False,
 'background': True,
 'build_id': 9856591,
 'chroot': None,
 'package_name': 'python-llama-cpp-python',
 'project_dirname': 'python-in-pulp',
 'project_name': 'python-in-pulp',
 'project_owner': '@python',
 'repos': [],
 'sandbox': '@python/python-in-pulp--https://src.fedoraproject.org/user/ttomecek',
 'source_json': {'clone_url': 'https://src.fedoraproject.org/rpms/python-llama-cpp-python',
                 'committish': '3c9caa9e14b1f4f00cb1a8a5cfc2ae3336d8e244',
                 'distgit': 'fedora'},
 'source_type': 10,
 'submitter': 'https://src.fedoraproject.org/user/ttomecek',
 'task_id': '9856591'}

Running: git clone https://src.fedoraproject.org/rpms/python-llama-cpp-python /var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python --depth 500 --no-single-branch --recursive
cmd: ['git', 'clone', 'https://src.fedoraproject.org/rpms/python-llama-cpp-python', '/var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python', '--depth', '500', '--no-single-branch', '--recursive']
cwd: .
rc: 0
stdout:
stderr: Cloning into '/var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python'...

Running: git checkout 3c9caa9e14b1f4f00cb1a8a5cfc2ae3336d8e244 --
cmd: ['git', 'checkout', '3c9caa9e14b1f4f00cb1a8a5cfc2ae3336d8e244', '--']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python
rc: 0
stdout:
stderr: Note: switching to '3c9caa9e14b1f4f00cb1a8a5cfc2ae3336d8e244'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at 3c9caa9 Bundling llama.cpp at a working commit

Running: dist-git-client sources
cmd: ['dist-git-client', 'sources']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python
rc: 0
stdout:
stderr: INFO: Reading stdout from command: git rev-parse --abbrev-ref HEAD
INFO: Reading stdout from command: git rev-parse HEAD
INFO: Reading sources specification file: sources
INFO: Downloading llama-cpp-python-0.3.14.tar.gz
INFO: Reading stdout from command: curl --help all
INFO: Calling: curl -H Pragma: -o llama-cpp-python-0.3.14.tar.gz --location --connect-timeout 60 --retry 3 --retry-delay 10 --remote-time --show-error --fail --retry-all-errors https://src.fedoraproject.org/repo/pkgs/rpms/python-llama-cpp-python/llama-cpp-python-0.3.14.tar.gz/sha512/c23481209d21c41de05c7e90088163aadf83b7b84907bb2825dfaaa16d6ffff47f2cfb6b41a9d6c8cd49560af9b298a06b264931e9c6284abf11c9e0b1fa7f80/llama-cpp-python-0.3.14.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  271k  100  271k    0     0  2103k      0 --:--:-- --:--:-- --:--:-- 2108k
INFO: Reading stdout from command: sha512sum llama-cpp-python-0.3.14.tar.gz
INFO: Downloading llama.cpp-79e0b68.tar.gz
INFO: Calling: curl -H Pragma: -o llama.cpp-79e0b68.tar.gz --location --connect-timeout 60 --retry 3 --retry-delay 10 --remote-time --show-error --fail --retry-all-errors https://src.fedoraproject.org/repo/pkgs/rpms/python-llama-cpp-python/llama.cpp-79e0b68.tar.gz/sha512/4168bdb1ba9baddb9a27a2c53077873d17a802f079e5e8ac22a089d583fd68034d06d908da2abf1435528bfa6fd657072ac7560fe4d2675746c68c250d098b7c/llama.cpp-79e0b68.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 24.0M  100 24.0M    0     0  83.2M      0 --:--:-- --:--:-- --:--:-- 83.4M
INFO: Reading stdout from command: sha512sum llama.cpp-79e0b68.tar.gz

Running: dist-git-client srpm --outputdir /var/lib/copr-rpmbuild/results
cmd: ['dist-git-client', 'srpm', '--outputdir', '/var/lib/copr-rpmbuild/results']
cwd: /var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python
rc: 0
stdout: setting SOURCE_DATE_EPOCH=1763942400
Wrote: /var/lib/copr-rpmbuild/results/python-llama-cpp-python-0.3.14-4.src.rpm
stderr: INFO: Reading stdout from command: git rev-parse --git-dir
INFO: Checked call: rpmbuild -bs /var/lib/copr-rpmbuild/results/python-llama-cpp-python.spec --define 'dist %nil' --define '_sourcedir /var/lib/copr-rpmbuild/workspace/workdir-apdz_hgx/python-llama-cpp-python/.' --define '_srcrpmdir /var/lib/copr-rpmbuild/results' --define '_disable_source_fetch 1'
Output: ['python-llama-cpp-python.spec', 'python-llama-cpp-python-0.3.14-4.src.rpm']

Running SRPMResults tool
Package info:
{
    "name": "python-llama-cpp-python",
    "epoch": null,
    "version": "0.3.14",
    "release": "4",
    "exclusivearch": [
        "x86_64",
        "aarch64"
    ],
    "excludearch": []
}
SRPMResults finished
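
Note: a minimal reproduction sketch, assembled only from the commands recorded above. The local output directory (~/srpm-results) is an arbitrary choice, and the availability of the copr-rpmbuild and dist-git-client packages in your repositories is assumed.

# Option 1: replay the whole Copr task (taken verbatim from the log header)
sudo dnf install copr-rpmbuild
/usr/bin/copr-rpmbuild --verbose --drop-resultdir --srpm --task-url https://copr.fedorainfracloud.org/backend/get-srpm-build-task/9856591

# Option 2: repeat the individual steps shown in this log with dist-git-client
# (Copr's --depth 500 --no-single-branch --recursive clone flags are dropped here for brevity)
git clone https://src.fedoraproject.org/rpms/python-llama-cpp-python
cd python-llama-cpp-python
git checkout 3c9caa9e14b1f4f00cb1a8a5cfc2ae3336d8e244 --
dist-git-client sources                            # fetches llama-cpp-python-0.3.14.tar.gz and llama.cpp-79e0b68.tar.gz from the lookaside cache
dist-git-client srpm --outputdir ~/srpm-results    # runs rpmbuild -bs and writes python-llama-cpp-python-0.3.14-4.src.rpm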