```javascript
console.log(response)
```

```python
# ...
response = model.invoke("Hello, world!")
print(response)
```
The LangChain support also opens up integrations with [LangGraph](https://www.langchain.com/langgraph) and [LangSmith](https://www.langchain.com/langsmith).
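For reference, the `model` object used in the snippets above can be pointed at a Gaia node through LangChain's OpenAI-compatible chat model class. The following is only a sketch, not the official example; it assumes the `langchain-openai` package and the public `llama8b.gaia.domains` node serving the `llama` model:

```python
# Sketch: construct a LangChain chat model backed by a Gaia node.
# The URL, API key, and model name below are assumptions; adjust them to your node.
from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    base_url="https://llama8b.gaia.domains/v1",  # the node's OpenAI-compatible API
    api_key="gaia",  # the node used here ignores the key, but the client requires a value
    model="llama",   # chat model served by the node
)

response = model.invoke("Hello, world!")
print(response.content)
```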

---

# LobeChat
You can configure [LobeChat](https://lobehub.com/) to use a Gaia node as its backend LLM API. It provides a richer and more customizable UI than the default Gaia chatbot UI.

## Steps

**Step 1: Set up the Gaia node API base URL as the OpenAI provider**

Go to the [Language Model Setting page](https://chat-preview.lobehub.com/settings/modal?agent=&session=inbox&tab=llm) and choose OpenAI.
1. Enter a random string in the OpenAI API Key field. It does not matter what you enter here because the Gaia node ignores it on the backend.

2. Enter the Gaia node API base URL in the API Proxy Address field. For example, you can use `https://llama8b.gaia.domains/v1` here.

3. Enable "Use Client-Side Fetching Mode".

4. Click on the "Get Model List" text to automatically fetch the LLMs available on the Gaia node, and choose the chat model `llama` here. (The sketch after the screenshot below shows what this request returns.)

5. Optional: click on the Check button to check the connection status.


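The "Get Model List" button corresponds to the node's OpenAI-compatible model listing endpoint. If you want to confirm the node is reachable outside of LobeChat, a quick check could look like the following sketch, which assumes the public node above and the `requests` package:

```python
# Sketch: list the models a Gaia node exposes through its OpenAI-compatible API.
# Assumes the public node used in this guide; swap in your own node's base URL.
import requests

base_url = "https://llama8b.gaia.domains/v1"
resp = requests.get(f"{base_url}/models", timeout=30)
resp.raise_for_status()

for m in resp.json().get("data", []):
    print(m["id"])  # expect to see the chat model "llama" listed here
```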
**Step 2: Start chatting via the LobeChat UI**

Next, let's go back to the chat page. Before you start chatting, select the model you configured in the previous step next to **Just Chat**.

Now you can chat with the Gaia node via LobeChat.


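With client-side fetching enabled, LobeChat sends its requests to the node's OpenAI-compatible chat API directly from your browser. You can reproduce the same kind of request with any OpenAI-style client; the sketch below uses the official `openai` Python package, with the URL, key, and model name taken from the setup above:

```python
# Sketch: the kind of chat completion request LobeChat issues to the Gaia node.
# Assumes the openai Python package and the public node configured above.
from openai import OpenAI

client = OpenAI(
    base_url="https://llama8b.gaia.domains/v1",
    api_key="gaia",  # any string works here; the node ignores it
)

chat = client.chat.completions.create(
    model="llama",
    messages=[{"role": "user", "content": "What is a Gaia node?"}],
)
print(chat.choices[0].message.content)
```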

---

# Obsidian
Obsidian is a note-taking application that enables users to create, link, and visualize ideas directly on their devices. With Obsidian, you can seamlessly sync notes across devices, publish your work, and collaborate with others. The app is highly customizable, allowing users to enhance functionality through a wide range of plugins and themes. Its unique features include a graph view to visualize connections between notes, making it ideal for managing complex information and fostering creativity. Obsidian also emphasizes data privacy by storing notes locally.

**Obsidian-local-gpt** is a plugin that allows users to run a local large language model within the Obsidian note-taking application. This plugin enables various AI-powered features directly in Obsidian, such as text generation, summarization, spelling and grammar checks, and task extraction.

A key feature of this plugin is that it supports a large number of open-source LLMs. You can choose an LLM that is fine-tuned for your specific task. For example, if you take a lot of coding notes, you could choose a Codestral, CodeLlama, or DeepSeek LLM. Furthermore, if you choose to run the LLM locally on your own computer, the plugin supports private and offline use of its LLM features. For more details, you can visit the [obsidian-local-gpt GitHub page](https://github.com/pfrankov/obsidian-local-gpt).

This guide explains how to set up and use the plugin with a Gaia node as an alternative to OpenAI or Ollama.
## Prerequisites

You will need a Gaia node ready to provide LLM services through a public URL. You can

* [Run your own node](../../getting-started/quick-start/quick-start.md)

* [Use a public node](../../nodes/nodes.md)

In this tutorial, we will use a public node.

| Attribute | Value |
|-----|--------|
| API endpoint URL | https://llama8b.gaia.domains/v1 |
| Model Name | llama |
## Obsidian-local-gpt Plugin Setup

Make sure you have already installed the Obsidian app on your device.

### Install the Obsidian-local-gpt Plugin

* Open Obsidian settings, navigate to "Community plugins", and search for `obsidian-local-gpt`.

* Install the plugin by clicking “Install”.



Then click “Enable”.
### Configure the Plugin

1. Go to the plugin settings.

2. Select "AI Provider" as "OpenAI compatible server".

3. Set the server URL. Use `https://llama8b.gaia.domains/` if you are using a public Gaia node, or `http://localhost:8080/` if you are running a local Gaia node.

4. Configure the API key for your Gaia node.



Make sure to click the refresh button and choose the **llama** model if you’re using the public Gaia node URL, or **Phi-3-mini-4k-instruct** if you’re using the local Gaia node. The sketch at the end of this section shows the raw API request these settings correspond to.


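To verify these settings outside of Obsidian, you can send the node a raw request equivalent to what an OpenAI-compatible client would issue. This is only a sketch; it assumes the public node, the `llama` model, and the standard `/v1/chat/completions` path, with a placeholder API key:

```python
# Sketch: a raw OpenAI-compatible chat request against the Gaia node configured above.
# Replace the base URL, API key, and model name to match your own setup.
import requests

BASE_URL = "https://llama8b.gaia.domains"  # or http://localhost:8080 for a local node
API_KEY = "YOUR_GAIA_API_KEY"              # placeholder; use whatever key you configured

resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "llama",  # or "Phi-3-mini-4k-instruct" on a local node
        "messages": [{"role": "user", "content": "Summarize this note in one sentence."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```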