Build Information
Failed to build LLamaSwift, reference main (5fff6a), with Swift 6.0 for Linux on 10 Dec 2024 16:09:50 UTC.
Build Command
bash -c docker run --pull=always --rm -v "checkouts-4606859-1":/host -w "$PWD" registry.gitlab.com/finestructure/spi-images:basic-6.0-latest swift build --triple x86_64-unknown-linux-gnu -Xswiftc -Xfrontend -Xswiftc -stats-output-dir -Xswiftc -Xfrontend -Xswiftc .stats -Xswiftc -strict-concurrency=complete -Xswiftc -enable-upcoming-feature -Xswiftc StrictConcurrency -Xswiftc -enable-upcoming-feature -Xswiftc DisableOutwardActorInference -Xswiftc -enable-upcoming-feature -Xswiftc GlobalActorIsolatedTypesUsability -Xswiftc -enable-upcoming-feature -Xswiftc InferSendableFromCaptures 2>&1
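The long tail of `-Xswiftc` flags enables complete strict-concurrency checking plus four upcoming language features. For reference, the same configuration can be expressed inside a target's `swiftSettings` in `Package.swift` (a sketch of the manifest-side equivalent; this package's manifest does not set these — the builder injects them on the command line):

```swift
// Package.swift fragment — hypothetical manifest-side equivalent of the
// builder's command-line flags; not part of this package.
swiftSettings: [
    .enableUpcomingFeature("StrictConcurrency"),
    .enableUpcomingFeature("DisableOutwardActorInference"),
    .enableUpcomingFeature("GlobalActorIsolatedTypesUsability"),
    .enableUpcomingFeature("InferSendableFromCaptures"),
    .unsafeFlags(["-strict-concurrency=complete"]),
]
```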
Build Log
========================================
RunAll
========================================
Builder version: 4.58.5
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/srgtuszy/llama-cpp-swift.git
Reference: main
Initialized empty Git repository in /host/spi-builder-workspace/.git/
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
From https://github.com/srgtuszy/llama-cpp-swift
* branch main -> FETCH_HEAD
* [new branch] main -> origin/main
HEAD is now at 5fff6aa Fixed sampler chain not being freed
Cloned https://github.com/srgtuszy/llama-cpp-swift.git
Revision (git rev-parse @):
5fff6aaeae60df84b03f7d0aa246745290fdc65f
SUCCESS checkout https://github.com/srgtuszy/llama-cpp-swift.git at main
========================================
Build
========================================
Selected platform: linux
Swift version: 6.0
Building package at path: $PWD
https://github.com/srgtuszy/llama-cpp-swift.git
WARNING: environment variable SUPPRESS_SWIFT_6_FLAGS is not set
{
  "dependencies" : [
    {
      "identity" : "llama.cpp",
      "requirement" : {
        "branch" : [
          "master"
        ]
      },
      "type" : "sourceControl",
      "url" : "https://github.com/ggerganov/llama.cpp"
    },
    {
      "identity" : "swift-log",
      "requirement" : {
        "range" : [
          {
            "lower_bound" : "1.6.1",
            "upper_bound" : "2.0.0"
          }
        ]
      },
      "type" : "sourceControl",
      "url" : "https://github.com/apple/swift-log.git"
    }
  ],
  "manifest_display_name" : "LLamaSwift",
  "name" : "LLamaSwift",
  "path" : "/host/spi-builder-workspace",
  "platforms" : [
    {
      "name" : "macos",
      "version" : "12.0"
    },
    {
      "name" : "ios",
      "version" : "14.0"
    },
    {
      "name" : "watchos",
      "version" : "4.0"
    },
    {
      "name" : "tvos",
      "version" : "14.0"
    },
    {
      "name" : "visionos",
      "version" : "1.0"
    }
  ],
  "products" : [
    {
      "name" : "LLamaSwift",
      "targets" : [
        "LLamaSwift"
      ],
      "type" : {
        "library" : [
          "automatic"
        ]
      }
    }
  ],
  "targets" : [
    {
      "c99name" : "llama_cpp_swiftTests",
      "module_type" : "SwiftTarget",
      "name" : "llama-cpp-swiftTests",
      "path" : "Tests/llama-cpp-swiftTests",
      "sources" : [
        "llama_cpp_swiftTests.swift"
      ],
      "target_dependencies" : [
        "LLamaSwift"
      ],
      "type" : "test"
    },
    {
      "c99name" : "LLamaSwift",
      "module_type" : "SwiftTarget",
      "name" : "LLamaSwift",
      "path" : "Sources",
      "product_dependencies" : [
        "llama",
        "Logging"
      ],
      "product_memberships" : [
        "LLamaSwift"
      ],
      "sources" : [
        "llama-cpp-swift/InferError.swift",
        "llama-cpp-swift/InitializationError.swift",
        "llama-cpp-swift/LLama.swift",
        "llama-cpp-swift/Logger+LLama.swift",
        "llama-cpp-swift/Model.swift"
      ],
      "type" : "library"
    }
  ],
  "tools_version" : "5.9"
}
Running build ...
bash -c docker run --pull=always --rm -v "checkouts-4606859-1":/host -w "$PWD" registry.gitlab.com/finestructure/spi-images:basic-6.0-latest swift build --triple x86_64-unknown-linux-gnu -Xswiftc -Xfrontend -Xswiftc -stats-output-dir -Xswiftc -Xfrontend -Xswiftc .stats -Xswiftc -strict-concurrency=complete -Xswiftc -enable-upcoming-feature -Xswiftc StrictConcurrency -Xswiftc -enable-upcoming-feature -Xswiftc DisableOutwardActorInference -Xswiftc -enable-upcoming-feature -Xswiftc GlobalActorIsolatedTypesUsability -Xswiftc -enable-upcoming-feature -Xswiftc InferSendableFromCaptures 2>&1
basic-6.0-latest: Pulling from finestructure/spi-images
Digest: sha256:47d26c99ca4f1ac0a332c85fd5b13ff4390e72115219984a57a68fe9d1063a05
Status: Image is up to date for registry.gitlab.com/finestructure/spi-images:basic-6.0-latest
Fetching https://github.com/ggerganov/llama.cpp
[1/133248] Fetching llama.cpp
Fetched https://github.com/ggerganov/llama.cpp from cache (23.16s)
Fetching https://github.com/apple/swift-log.git
[1/3727] Fetching swift-log
Fetched https://github.com/apple/swift-log.git from cache (0.32s)
Computing version for https://github.com/apple/swift-log.git
Computed https://github.com/apple/swift-log.git at 1.6.2 (0.39s)
Creating working copy for https://github.com/apple/swift-log.git
Working copy of https://github.com/apple/swift-log.git resolved at 1.6.2
Creating working copy for https://github.com/ggerganov/llama.cpp
Working copy of https://github.com/ggerganov/llama.cpp resolved at master (26a8406)
warning: couldn't find pc file for llama
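The `couldn't find pc file for llama` warning suggests the `llama` module may be declared as a system-library target with a `pkgConfig: "llama"` hint, which SwiftPM uses to discover header and library paths. A minimal `llama.pc` on the builder's `PKG_CONFIG_PATH` would look roughly like this (a sketch only; the install prefix and version are assumptions, and no such file exists in the build image):

```
# llama.pc — hypothetical pkg-config file; prefix and version are guesses.
prefix=/usr/local
includedir=${prefix}/include
libdir=${prefix}/lib

Name: llama
Description: llama.cpp C API
Version: 0.0.0
Cflags: -I${includedir}
Libs: -L${libdir} -lllama
```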
Building for debugging...
[0/3] Write sources
[2/3] Write swift-version-24593BA9C3E375BF.txt
[4/8] Compiling Logging MetadataProvider.swift
[5/8] Compiling Logging Logging.swift
[6/8] Compiling Logging LogHandler.swift
[7/8] Emitting module Logging
[8/8] Compiling Logging Locks.swift
[10/15] Compiling LLamaSwift Model.swift
<module-includes>:1:10: note: in file included from <module-includes>:1:
1 | #include "llama.h"
| `- note: in file included from <module-includes>:1:
2 |
/host/spi-builder-workspace/.build/checkouts/llama.cpp/Sources/llama/llama.h:3:10: error: 'llama.h' file not found with <angled> include; use "quotes" instead
1 | #pragma once
2 |
3 | #include <llama.h>
| `- error: 'llama.h' file not found with <angled> include; use "quotes" instead
4 |
5 |
/host/spi-builder-workspace/Sources/llama-cpp-swift/LLama.swift:3:8: error: could not build C module 'llama'
1 | import Foundation
2 | import Logging
3 | import llama
| `- error: could not build C module 'llama'
4 |
5 | /// An actor that handles inference using the LLama language model.
error: emit-module command failed with exit code 1 (use -v to see invocation)
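The diagnostic repeats for every file in the module: the umbrella header in the llama.cpp checkout includes the public header with angled brackets, which searches only the compiler's system include paths. Editing the upstream header is one option; another is to forward an extra header search path to the Clang importer from the consuming target. The fragment below is a sketch of that workaround, assuming the checkout keeps its public header under `include/` — it is not this package's actual target declaration:

```swift
// Package.swift fragment — hypothetical workaround, not this package's
// actual manifest.
.target(
    name: "LLamaSwift",
    path: "Sources",
    swiftSettings: [
        // Forward -I to the Clang importer so `#include <llama.h>`
        // in the umbrella header can resolve as an angled include.
        .unsafeFlags(["-Xcc", "-I.build/checkouts/llama.cpp/include"])
    ]
)
```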
[11/15] Emitting module LLamaSwift
[12/15] Compiling LLamaSwift InitializationError.swift
[13/15] Compiling LLamaSwift LLama.swift
[14/15] Compiling LLamaSwift InferError.swift
[15/15] Compiling LLamaSwift Logger+LLama.swift
BUILD FAILURE 6.0 linux