jan
Here are 30 public repositories matching this topic...
Cortex.Tensorrt-LLM is a C++ inference library that can be loaded by any server at runtime. It includes NVIDIA's TensorRT-LLM as a submodule for GPU-accelerated inference on NVIDIA GPUs.
- C++
Jan.ai Website & Documentation
- MDX
Jan is a high-performance web server. (Coming soon)
[DEPRECATED] A simple JavaScript wrapper for the Jan.ai API, making it easy to issue API requests from JavaScript applications.
- JavaScript
Authorization tokens to access llama.cpp server (LM Studio, Ollama, Msty, GPT4All, Jan)
- Python
