Build Information
Successful build of llama.swift, reference 1.1.1 (74eb46), with Swift 6.2 for macOS (SPM) on 20 Jun 2025 04:16:24 UTC.
Swift 6 data race errors: 0
Build Command
env DEVELOPER_DIR=/Applications/Xcode-26.0.0-Beta.app xcrun swift build --arch arm64
Build Log
========================================
RunAll
========================================
Builder version: 4.64.0
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/alexrozanski/llama.swift.git
Reference: 1.1.1
Initialized empty Git repository in /Users/admin/builder/spi-builder-workspace/.git/
From https://github.com/alexrozanski/llama.swift
* tag 1.1.1 -> FETCH_HEAD
HEAD is now at 74eb467 fix error handling in LlamaEvent
Cloned https://github.com/alexrozanski/llama.swift.git
Revision (git rev-parse @):
74eb4678ad42fa464208a1a037285a56407e9b70
SUCCESS checkout https://github.com/alexrozanski/llama.swift.git at 1.1.1
========================================
Build
========================================
Selected platform: macosSpm
Swift version: 6.2
Building package at path: $PWD
https://github.com/alexrozanski/llama.swift.git
Running build ...
env DEVELOPER_DIR=/Applications/Xcode-26.0.0-Beta.app xcrun swift build --arch arm64
Building for debugging...
[0/9] Write sources
[1/9] Write swift-version-1EA4D86E10B52AF.txt
[2/9] Compiling ggml.c
[3/9] Compiling LlamaEvent.mm
[4/9] Compiling LlamaRunnerBridgeConfig.m
[5/9] Compiling llamaObjCxx LlamaError.m
[6/9] Compiling LlamaRunnerBridge.mm
[7/9] Compiling LlamaPredictOperation.mm
[8/9] Compiling utils.cpp
[10/11] Compiling llama LlamaRunner.swift
[11/11] Emitting module llama
Build complete! (4.89s)
Build complete.
{
  "c_language_standard" : "gnu11",
  "cxx_language_standard" : "gnu++20",
  "dependencies" : [
  ],
  "manifest_display_name" : "llama.swift",
  "name" : "llama.swift",
  "path" : "/Users/admin/builder/spi-builder-workspace",
  "platforms" : [
    {
      "name" : "macos",
      "version" : "10.15"
    },
    {
      "name" : "ios",
      "version" : "13.0"
    }
  ],
  "products" : [
    {
      "name" : "llama",
      "targets" : [
        "llama"
      ],
      "type" : {
        "library" : [
          "automatic"
        ]
      }
    }
  ],
  "targets" : [
    {
      "c99name" : "llamaObjCxx",
      "module_type" : "ClangTarget",
      "name" : "llamaObjCxx",
      "path" : "Sources/llamaObjCxx",
      "product_memberships" : [
        "llama"
      ],
      "sources" : [
        "LlamaError.m",
        "bridge/LlamaEvent.mm",
        "bridge/LlamaPredictOperation.mm",
        "bridge/LlamaRunnerBridge.mm",
        "bridge/LlamaRunnerBridgeConfig.m",
        "cpp/ggml.c",
        "cpp/utils.cpp"
      ],
      "type" : "library"
    },
    {
      "c99name" : "llama",
      "module_type" : "SwiftTarget",
      "name" : "llama",
      "path" : "Sources/llama",
      "product_memberships" : [
        "llama"
      ],
      "sources" : [
        "LlamaRunner.swift"
      ],
      "target_dependencies" : [
        "llamaObjCxx"
      ],
      "type" : "library"
    }
  ],
  "tools_version" : "5.5"
}
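The manifest dump above implies a `Package.swift` roughly like the following. This is a sketch reconstructed from the JSON, not the repository's actual manifest file, and details (source layout declarations, build settings) may differ:

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "llama.swift",
    platforms: [
        .macOS(.v10_15),
        .iOS(.v13)
    ],
    products: [
        // Automatic library product exposing the Swift "llama" module.
        .library(name: "llama", targets: ["llama"])
    ],
    targets: [
        // Clang target wrapping the Objective-C++ bridge and C/C++ sources
        // (ggml.c, utils.cpp, LlamaRunnerBridge.mm, etc.).
        .target(
            name: "llamaObjCxx",
            path: "Sources/llamaObjCxx"
        ),
        // Swift target (LlamaRunner.swift) layered on top of the bridge.
        .target(
            name: "llama",
            dependencies: ["llamaObjCxx"],
            path: "Sources/llama"
        )
    ],
    cLanguageStandard: .gnu11,
    cxxLanguageStandard: .gnucxx20
)
```

The `cLanguageStandard` and `cxxLanguageStandard` settings mirror the `gnu11` / `gnu++20` values in the dump and apply to the Clang target's C and C++ sources respectively.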
Done.