```
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  -e OPENAI_API_BASE_URL="https://llama8b.gaia.domains/v1" \
  -e OPENAI_API_KEYS="gaianet" \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```
Then, open `http://localhost:3000` in your browser and you will see the Open WebUI page.

You can also configure your own node after the web UI is started.

* Click on your profile in the top right corner and choose **Settings**.
* Then choose **Connections**. In the OpenAI API field, enter your node's API base URL and several random characters as the API key.
* Click **Save** to make the change take effect.
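Under the hood, Open WebUI talks to the node through the standard OpenAI chat completions API. The sketch below builds the same kind of request using only the Python standard library; the model name `llama` is a placeholder (use whatever your node reports), and the actual POST is left commented out so the sketch runs without a live node:

```python
import json

def chat_request(base_url, model, prompt):
    """Build an OpenAI-compatible chat completions request for a Gaia node."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Gaia nodes accept any non-empty key, hence the "random characters" above.
        "Authorization": "Bearer gaianet",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = chat_request("https://llama8b.gaia.domains/v1", "llama", "Hello!")
print(url)  # → https://llama8b.gaia.domains/v1/chat/completions

# To actually send it against a live node:
# import urllib.request
# req = urllib.request.Request(url, data=body.encode(), headers=headers)
# print(urllib.request.urlopen(req).read().decode())
```

This is the same request shape Open WebUI sends once the connection above is saved.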
## Use Open WebUI as a Chatbot UI

Simply choose the chat model under **Select a model** and then you can send messages to the Gaia node.

## Use Open WebUI as a client-side RAG tool

Open WebUI also offers a way to implement a RAG application. Since Gaia nodes provide OpenAI-compatible embedding APIs, you can use this feature as well. However, it is recommended to start a node without any snapshots, like [this one](https://github.com/GaiaNet-AI/node-configs/tree/main/llama-3-8b-instruct).
**Step 1:** Use the Gaia node as the embedding API

* Click on **Workspace** at the top left and choose the **Documents** tab. This is where you manage the uploaded documents.
* Click on **Document Settings** to configure the embedding settings.
* In the **General Settings**, choose OpenAI as the Embedding Model Engine. Enter the node's API base URL and several random characters as the API key. Then, enter the embedding model name in the Embedding Model field. Click **Save** to apply the changes.

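With those settings saved, Open WebUI will call the node's OpenAI-compatible `/v1/embeddings` endpoint for each document chunk it indexes. A minimal sketch of that request shape, assuming a placeholder embedding model name `nomic-embed` (substitute your node's actual embedding model):

```python
import json

def embeddings_request(base_url, model, chunks):
    """Build an OpenAI-compatible embeddings request for a batch of text chunks."""
    url = base_url.rstrip("/") + "/embeddings"
    body = json.dumps({"model": model, "input": chunks})
    return url, body

url, body = embeddings_request(
    "https://llama8b.gaia.domains/v1", "nomic-embed", ["chunk one", "chunk two"]
)
print(url)  # → https://llama8b.gaia.domains/v1/embeddings
```

The response contains one embedding vector per input chunk, which Open WebUI stores for retrieval later.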
**Step 2:** Upload your documents

Click on **+** to upload your documents.
**Step 3:** Chat

Then go back to the chat page. Before you send a message, type **#** to choose the document you want to use as the context.

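Conceptually, the `#` picker is the client-side retrieval step: Open WebUI embeds your question, ranks the stored chunk embeddings by similarity, and injects the closest chunks into the prompt as context. The ranking is essentially cosine similarity; a toy sketch with hand-made 2-D vectors (real vectors come from the embeddings endpoint):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_chunks(query_vec, chunk_vecs, k=1):
    """Return indices of the k chunks most similar to the query vector."""
    ranked = sorted(range(len(chunk_vecs)),
                    key=lambda i: cosine(query_vec, chunk_vecs[i]),
                    reverse=True)
    return ranked[:k]

# Toy "embeddings": chunk 0 points almost the same way as the query.
chunks = [[1.0, 0.1], [0.0, 1.0]]
print(top_chunks([1.0, 0.0], chunks))  # → [0]
```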
That's it.
---

# [IDE] Zed
[Zed](https://zed.dev/) is a next-generation code editor designed for high-performance collaboration with humans and AI, and it is written in Rust. You can use Zed with your own Gaia node as the LLM backend, for two big reasons:

* Your Gaia node can be supplemented by a knowledge base that is specific to your proprietary code repository, programming language choices, and coding guidelines/styles.
* Your Gaia node can ensure that your code stays private within your organization.
## Prerequisites

You will need a Gaia node to provide LLM services to Zed. You can

* [run your own node](../../getting-started/quick-start/quick-start.md)
* [use a public node](../../nodes/nodes.md)

In this tutorial, we will use public [Qwen 2.5 Coder](https://github.com/QwenLM/Qwen2.5-Coder) nodes to power Zed.
| Model type | API base URL | Model name |
|-----|--------|-----|
| General coding assistant | `https://coder.gaia.domains/v1` | coder |
| Coding assistant with Rust knowledge | `https://rustcoder.gaia.domains/v1` | rustcoder |
| Rust expert (slower but more accurate) | `https://rustexpert.gaia.domains/v1` | rustexpert |
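These are plain OpenAI-compatible endpoints, so you can sanity-check one outside Zed as well. Each node lists its models at `GET /v1/models`; the sketch below builds that URL and parses a response of the standard OpenAI model-list shape (the sample payload here is illustrative, not a captured response):

```python
import json

def models_url(base_url):
    """URL of the OpenAI-compatible model listing endpoint."""
    return base_url.rstrip("/") + "/models"

def model_ids(response_text):
    """Extract model ids from an OpenAI-style model list response."""
    return [m["id"] for m in json.loads(response_text)["data"]]

print(models_url("https://rustcoder.gaia.domains/v1"))
sample = '{"object": "list", "data": [{"id": "rustcoder", "object": "model"}]}'
print(model_ids(sample))  # → ['rustcoder']
```

The `id` values returned by your node are what you would put in the Model name column above, and in Zed's `available_models` below.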
> A Gaia node comes with a default networking tunnel that turns your local LLM service into an HTTPS service accessible from the Internet. That allows Zed to use your own private LLM for coding. Start your own [Qwen Coder](https://github.com/GaiaNet-AI/node-configs/tree/main/qwen-2.5-coder-7b-instruct) or [Qwen Coder with Rust](https://github.com/GaiaNet-AI/node-configs/tree/main/qwen-2.5-coder-7b-instruct_rustlang) nodes today!
## Configure Zed

First, download and install [Zed](https://zed.dev/). Click on your profile on the top right and choose **Settings**. A new tab called `settings.json` will open, and you can configure Zed by editing this file.

Below is the `settings.json` we used. You can copy and paste the `language_models` and `assistant` sections into your own file. They configure Zed to use an OpenAI-compatible API provider and then specify the API endpoint URL and model name for that provider.
```
{
  "features": {
    "inline_completion_provider": "none"
  },
  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://rustcoder.gaia.domains/v1",
      "low_speed_timeout_in_seconds": 60,
      "available_models": [
        {
          "name": "yicoder9b",
          "max_tokens": 8096