Swift Package Index

Build Information

Successful build of SwiftLlama, reference main (792181), with Swift 6.2 for macOS (SPM) on 26 Aug 2025 09:41:43 UTC.

Swift 6 data race errors: 0

Build Command

env DEVELOPER_DIR=/Applications/Xcode-26.0.0-Beta.app xcrun swift build --arch arm64

Build Log

========================================
RunAll
========================================
Builder version: 4.67.1
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/ShenghaiWang/SwiftLlama.git
Reference: main
Initialized empty Git repository in /Users/admin/builder/spi-builder-workspace/.git/
From https://github.com/ShenghaiWang/SwiftLlama
 * branch            main       -> FETCH_HEAD
 * [new branch]      main       -> origin/main
HEAD is now at 7921814 Merge pull request #22 from gpotari/main
Cloned https://github.com/ShenghaiWang/SwiftLlama.git
Revision (git rev-parse @):
792181492beff9edba52314a040032cefb19edd6
SUCCESS checkout https://github.com/ShenghaiWang/SwiftLlama.git at main
========================================
Build
========================================
Selected platform:         macosSpm
Swift version:             6.2
Building package at path:  $PWD
https://github.com/ShenghaiWang/SwiftLlama.git
Running build ...
env DEVELOPER_DIR=/Applications/Xcode-26.0.0-Beta.app xcrun swift build --arch arm64
Building for debugging...
[0/17] Write sources
[1/17] Copying ggml-metal.metal
[2/17] Write swift-version-1EA4D86E10B52AF.txt
[3/17] Copying llama.framework
[4/17] Compiling llama resource_bundle_accessor.m
[5/17] Compiling llama-vocab.cpp
[6/17] Compiling llama-sampling.cpp
[7/17] Compiling llama-grammar.cpp
[8/17] Compiling ggml-alloc.c
[9/17] Compiling ggml-metal.m
[10/17] Compiling ggml-aarch64.c
[11/17] Compiling ggml-backend.cpp
[12/17] Compiling ggml-quants.c
[13/17] Compiling unicode.cpp
[14/17] Compiling unicode-data.cpp
[15/17] Compiling ggml.c
[16/17] Compiling llama.cpp
[18/28] Compiling SwiftLlama Chat.swift
[19/29] Compiling SwiftLlama LlamaModel.swift
<module-includes>:1:9: note: in file included from <module-includes>:1:
1 | #import "/Users/admin/builder/spi-builder-workspace/.build/checkouts/llama.cpp/spm-headers/llama.h"
  |         `- note: in file included from <module-includes>:1:
2 |
/Users/admin/builder/spi-builder-workspace/.build/checkouts/llama.cpp/spm-headers/llama.h:1218:1: warning: umbrella header for module 'llama' does not include header 'ggml-metal.h'
1216 |
1217 | #endif // LLAMA_H
1218 |
     | `- warning: umbrella header for module 'llama' does not include header 'ggml-metal.h'
[20/29] Compiling SwiftLlama Batch.swift
<module-includes>:1:9: note: in file included from <module-includes>:1:
1 | #import "/Users/admin/builder/spi-builder-workspace/.build/checkouts/llama.cpp/spm-headers/llama.h"
  |         `- note: in file included from <module-includes>:1:
2 |
/Users/admin/builder/spi-builder-workspace/.build/checkouts/llama.cpp/spm-headers/llama.h:1218:1: warning: umbrella header for module 'llama' does not include header 'ggml-metal.h'
1216 |
1217 | #endif // LLAMA_H
1218 |
     | `- warning: umbrella header for module 'llama' does not include header 'ggml-metal.h'
[21/29] Compiling SwiftLlama TypeAlias.swift
[22/29] Emitting module SwiftLlama
[23/29] Compiling SwiftLlama SwiftLlamaError.swift
[24/29] Compiling SwiftLlama Prompt.swift
[25/29] Compiling SwiftLlama Session.swift
[26/29] Compiling SwiftLlama StopToken.swift
[27/29] Compiling SwiftLlama Swiftllama.swift
[28/29] Compiling SwiftLlama Configuration.swift
[29/29] Compiling SwiftLlama SwiftllamaActor.swift
Build complete! (137.01s)
Fetching https://github.com/ggerganov/llama.cpp.git
[1/218368] Fetching llama.cpp
Fetched https://github.com/ggerganov/llama.cpp.git from cache (113.58s)
Creating working copy for https://github.com/ggerganov/llama.cpp.git
Working copy of https://github.com/ggerganov/llama.cpp.git resolved at b6d6c5289f1c9c677657c380591201ddb210b649
Downloading binary artifact https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
[32750/74944281] Downloading https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
Downloaded https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip (4.21s)
Build complete.
{
  "dependencies" : [
    {
      "identity" : "llama.cpp",
      "requirement" : {
        "revision" : [
          "b6d6c5289f1c9c677657c380591201ddb210b649"
        ]
      },
      "type" : "sourceControl",
      "url" : "https://github.com/ggerganov/llama.cpp.git"
    }
  ],
  "manifest_display_name" : "SwiftLlama",
  "name" : "SwiftLlama",
  "path" : "/Users/admin/builder/spi-builder-workspace",
  "platforms" : [
    {
      "name" : "macos",
      "version" : "15.0"
    },
    {
      "name" : "ios",
      "version" : "18.0"
    },
    {
      "name" : "watchos",
      "version" : "11.0"
    },
    {
      "name" : "tvos",
      "version" : "18.0"
    },
    {
      "name" : "visionos",
      "version" : "2.0"
    }
  ],
  "products" : [
    {
      "name" : "SwiftLlama",
      "targets" : [
        "SwiftLlama"
      ],
      "type" : {
        "library" : [
          "automatic"
        ]
      }
    }
  ],
  "targets" : [
    {
      "c99name" : "SwiftLlamaTests",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlamaTests",
      "path" : "Tests/SwiftLlamaTests",
      "sources" : [
        "SwiftLlamaTests.swift"
      ],
      "target_dependencies" : [
        "SwiftLlama"
      ],
      "type" : "test"
    },
    {
      "c99name" : "SwiftLlama",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlama",
      "path" : "Sources/SwiftLlama",
      "product_dependencies" : [
        "llama"
      ],
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
        "LlamaModel.swift",
        "Models/Batch.swift",
        "Models/Chat.swift",
        "Models/Configuration.swift",
        "Models/Prompt.swift",
        "Models/Session.swift",
        "Models/StopToken.swift",
        "Models/SwiftLlamaError.swift",
        "Models/TypeAlias.swift",
        "Swiftllama.swift",
        "SwiftllamaActor.swift"
      ],
      "target_dependencies" : [
        "LlamaFramework"
      ],
      "type" : "library"
    },
    {
      "c99name" : "LlamaFramework",
      "module_type" : "BinaryTarget",
      "name" : "LlamaFramework",
      "path" : "remote/archive/llama-b5046-xcframework.zip",
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
      ],
      "type" : "binary"
    }
  ],
  "tools_version" : "6.0"
}
Done.
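The resolved manifest above shows the package's product name (`SwiftLlama`), repository URL, tools version (6.0), and platform floors (macOS 15, iOS 18, and so on). For reference, here is a minimal sketch of a consumer `Package.swift` that depends on this package. The package name `MyApp` and the choice to track the `main` branch (the reference built above) are illustrative assumptions; a tagged release pin may be preferable in practice.

```swift
// swift-tools-version: 6.0
// Sketch of a consumer manifest; adjust the target name, platforms,
// and dependency pinning to suit your project.
import PackageDescription

let package = Package(
    name: "MyApp",  // hypothetical consumer package
    platforms: [
        .macOS(.v15)  // SwiftLlama's manifest declares macOS 15.0 as its minimum
    ],
    dependencies: [
        // Track the main branch, matching the reference built in this log.
        .package(url: "https://github.com/ShenghaiWang/SwiftLlama.git", branch: "main")
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            dependencies: [
                .product(name: "SwiftLlama", package: "SwiftLlama")
            ]
        )
    ]
)
```

Resolving this manifest pulls in `llama.cpp` as a transitive source dependency and downloads the prebuilt `llama-b5046` xcframework binary target, as seen in the fetch steps above.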