
Conversation

xkong-anaconda commented Dec 2, 2025

llama.cpp 0.0.7229

Destination channel: default

Links

CUDA variants:

Explanation of changes:

Version Update:

  • Upgraded from b6872 to b7229
  • Version scheme changed from 0.0.7229 to 7229 to fix a macOS linker error with dylib versioning (matches the conda-forge approach)
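
For context, a minimal sketch of where the new scheme would land in the recipe; the layout below is an illustrative assumption, not quoted from this feedstock. The likely mechanics: Mach-O dylib version fields pack major.minor.patch with only 8 bits each for the minor and patch components, so the trailing 7229 in 0.0.7229 overflows, while a bare 7229 maps to 7229.0.0 and fits in the 16-bit major field.

```yaml
# Hypothetical recipe excerpt (illustrative only, not this feedstock's meta.yaml).
# Assuming the build derives the dylib's current_version from the package version,
# the bare build number keeps every component within the Mach-O field limits.
package:
  name: llama.cpp
  version: "7229"   # previously "0.0.7229", which ld rejects on macOS
```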

**Patch Cleanup**


xkong-anaconda self-assigned this Dec 2, 2025
xkong-anaconda and others added 10 commits December 2, 2025 09:58
…stent 2022.14

The c_stdlib_version 2022.14 doesn't exist as a conda package. The standard
version used across all conda-forge feedstocks is 2019.11, even when building
with VS2022 compiler.

This matches the global aggregate conda_build_config.yaml configuration.
Windows uses VS compiler's built-in C stdlib (c_stdlib: vs) and doesn't
need a c_stdlib_version. The c_win-64 package doesn't exist in any version.

All other Windows feedstocks (scikit-learn, xgboost, pytorch, tensorflow,
faiss, whisper.cpp) don't specify c_stdlib_version for Windows - they rely
on the global default behavior.

This fixes the build error:
  c_win-64 =2019.11 * does not exist (perhaps a typo or a missing channel).
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
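
As a rough illustration of the resulting Windows setup, a conda-forge-style conda_build_config.yaml fragment is sketched below; the keys are the standard ones, but the specific values and selectors are assumptions rather than quotes from the aggregate config.

```yaml
# Hypothetical excerpt of a conda-forge-style conda_build_config.yaml.
# Windows only gets c_stdlib: vs (the MSVC-provided C runtime) and no
# c_stdlib_version entry, so nothing resolves to a non-existent c_win-64 package.
c_stdlib:
  - sysroot      # [linux]
  - vs           # [win]
c_stdlib_version:
  - "2.17"       # [linux]
```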
xkong-anaconda changed the title from "Upgrade to b72xx" to "Upgrade to b7229" on Dec 4, 2025
cbouss self-requested a review December 4, 2025 14:34