llama.cpp 3983.0

LLM inference in C/C++.

Description

The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.

Current version: 3983.0, updated 2 months ago.