@coder-ka/ollama-as-webapi
v1.0.2
ollama-as-webapi
This repository provides a simple CLI tool to serve Ollama models as a Web API.
Prerequisites
- Node.js and npm (to install the CLI)
- Ollama installed and running locally
Installation
```shell
npm i -g @coder-ka/ollama-as-webapi
```

Usage
```shell
ollama-as-webapi
```

CLI Options
| Option | Description | Default |
| --- | --- | --- |
| --model | Model name | gemma3:4b |
| --port | Port number | 3000 |
| --host | Host name | localhost |
| --gpu | Use GPU | true |
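For example, the options above can be combined on one invocation (the model name here is illustrative, not a recommendation):

```shell
# Serve a different model on port 8080, reachable from other machines
ollama-as-webapi --model llama3:8b --port 8080 --host 0.0.0.0
```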
Web API Reference
Content Generation
/generate
Request Body
| Field | Required | Type | Description | Default |
| --- | --- | --- | --- | --- |
| prompt | Yes | string | Prompt to generate content for | "" |
| maxTokens | No | number | Maximum number of tokens to generate | 256 |
| temperature | No | number | Temperature for sampling | 0.7 |
| topP | No | number | Top-p (nucleus) sampling threshold | 0.9 |
| topK | No | number | Top-k sampling cutoff | 40 |
| n | No | number | Number of samples to generate | 1 |
| stop | No | string[] | Stop sequences | [] |
Response Body
| Field | Type | Description |
| --- | --- | --- |
| text | string | Generated content |
| tokens | number | Number of tokens generated |
| finishReason | string | Reason for finishing generation |
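The request and response shapes above can be wrapped in a small typed client. This is a minimal sketch, not part of the package: the `generate` helper and the `baseUrl` default are assumptions, and it presumes the server is running on the default host and port.

```typescript
// Field names mirror the Request Body and Response Body tables above.
type GenerateRequest = {
  prompt: string;        // required
  maxTokens?: number;    // server default: 256
  temperature?: number;  // server default: 0.7
  topP?: number;         // server default: 0.9
  topK?: number;         // server default: 40
  n?: number;            // server default: 1
  stop?: string[];       // server default: []
};

type GenerateResponse = {
  text: string;          // generated content
  tokens: number;        // number of tokens generated
  finishReason: string;  // reason generation stopped
};

// Hypothetical helper: POSTs the request to /generate and parses the JSON response.
async function generate(
  req: GenerateRequest,
  baseUrl = "http://localhost:3000"
): Promise<GenerateResponse> {
  const res = await fetch(`${baseUrl}/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`generate failed: HTTP ${res.status}`);
  return (await res.json()) as GenerateResponse;
}
```

Usage would look like `await generate({ prompt: "Hello", maxTokens: 64 })`.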
Example
```shell
curl -X POST http://localhost:3000/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello"}'
```

License
MIT
How to contribute
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Open a Pull Request
