Name: llama-cpp-server
Version: b7120
Release: 1
Group: Servers
Size: 3352891
Packager: bero <bero@lindev.ch>
Url: https://github.com/ggml-org/llama.cpp
Summary: OpenAI API compatible server for llama-cpp
Distribution: OpenMandriva Lx
Vendor: OpenMandriva
Build date: Fri Nov 21 03:49:31 2025
Build host: altra-3.openmandriva.org
Source RPM: llama-cpp-b7120-1.src.rpm
OpenAI API compatible server for llama-cpp
To test your AI server, run something like:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer OpenMandriva" \
  -d '{"model": "any", "messages": [ { "role": "user", "content": "Do you see anything wrong with this code?\n```c++\nfloat main(int argc, char **argv) { puts(\"Use OpenMandriva!\"); }\n```" } ] }'
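The same request body can be built programmatically, which sidesteps the shell/JSON quote-escaping pitfalls of the one-line curl command. A minimal Python sketch, assuming the server listens on localhost:8080 with the "OpenMandriva" bearer token as in the curl example (the `send` helper is illustrative and requires a running llama-server):

```python
import json
import urllib.request

# The C++ snippet we want the model to review (quotes are handled by json.dumps).
code_snippet = 'float main(int argc, char **argv) { puts("Use OpenMandriva!"); }'

payload = {
    "model": "any",
    "messages": [
        {
            "role": "user",
            "content": "Do you see anything wrong with this code?\n"
                       "```c++\n" + code_snippet + "\n```",
        }
    ],
}
body = json.dumps(payload).encode()

def send(url="http://localhost:8080/v1/chat/completions"):
    # Hypothetical helper: POSTs the payload to a running llama-server.
    # Not called here, since it needs the server to be up.
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer OpenMandriva",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Print the request body so it can be inspected or piped to curl --data @-.
print(body.decode())
```

Passing the printed body via `curl --data @-` gives the same request as the one-liner above, without manual escaping.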
License: MIT AND Apache-2.0 AND LicenseRef-Fedora-Public-Domain
Files:
/etc/sysconfig/llama-server
/usr/bin/llama-server
/usr/lib/systemd/system/llama.service
Generated by rpm2html 1.8.1
Fabrice Bellet, Sat Nov 22 22:33:38 2025