Next, use the following command to start the Gaia node.
```
gaianet start
```
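Once the node is up, it serves an OpenAI-compatible API. The snippet below is a minimal sketch of how you could build a chat-completion request against it; the `http://localhost:8080/v1` base URL and the model name are assumptions — use the endpoint and model name that `gaianet start` prints for your node.

```python
import json
import urllib.request

# Assumed defaults -- confirm against the URL printed by `gaianet start`.
BASE_URL = "http://localhost:8080/v1"
MODEL = "gemma-2-27b-it-Q5_K_M"  # replace with your node's model name

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for the local node."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("Say hello in Chinese.")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
# Uncomment once the node is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the request returns a normal chat completion, the Translation Agent will be able to talk to the node through the same endpoint.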
### Step 2.2: Run the Translation Agent on top of gemma-2-27b
Find the `examples/example_script.py` file in your cloned agent repo and review its code. It tells the agent where to find your document and how to translate it. Change the model name to the one you are using (here, `gemma-2-27b-it-Q5_K_M`), and set the source and target languages you want (here, `Chinese` as the source language and `English` as the target language).
```
import os

import translation_agent as ta

if __name__ == "__main__":
    source_lang, target_lang, country = "Chinese", "English", "Britain"

    relative_path = "sample-texts/forbiddencity.txt"
    script_dir = os.path.dirname(os.path.abspath(__file__))
    full_path = os.path.join(script_dir, relative_path)

    with open(full_path, encoding="utf-8") as file:
        source_text = file.read()

    print(f"Source text:\n\n{source_text}\n------------\n")

    translation = ta.translate(
        source_lang=source_lang,
        target_lang=target_lang,
        source_text=source_text,
        country=country,
        model="gemma-2-27b-it-Q5_K_M",
    )

    print(f"Translation:\n\n{translation}")
```
Then, find the `examples/sample-texts` folder in your cloned repo. Put the file you want to translate in this folder and note its path. Here, because we named our source text `forbiddencity.txt`, the relative path to the document is `sample-texts/forbiddencity.txt`.
Run the commands below to have your text file translated into English.
```
cd examples
python example_script.py
```
You can find the translated result in English [here](https://hackmd.io/tdLiVR3TSc-8eVg_E-j9QA?view#English-Translation-by-gemma-2-27b).
## Demo 3: Running Translation Agents with Phi-3-Medium long context model
The Llama-3 and Gemma-2 models are great LLMs, but they have relatively small context windows. The agent requires all text to fit into the LLM's context window, which limits the size of the articles it can translate. To fix this problem, we can select an open-source LLM with a large context window. For this demo, we chose Microsoft's Phi-3-medium-128k model, which has a massive 128k-token context window (roughly 100,000 words, or the length of several books).
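Before picking a model, it helps to estimate whether your document fits in a given context window. The sketch below uses a rough characters-per-token heuristic (the `chars_per_token` ratio and the 2,048-token reserve for the prompt template and reply are illustrative assumptions, not exact figures — real tokenizers vary by model and language):

```python
# Rough check of whether a document fits in a model's context window.
# Heuristic: ~4 characters per token for English text, closer to 1-2
# characters per token for Chinese; these ratios are approximations.

def estimate_tokens(text: str, chars_per_token: float = 2.0) -> int:
    """Conservative token estimate for mixed or CJK text."""
    return int(len(text) / chars_per_token)

def fits_context(text: str, context_window: int, reserve: int = 2048) -> bool:
    """Leave `reserve` tokens for the prompt template and the model's reply."""
    return estimate_tokens(text) + reserve <= context_window

article = "紫禁城" * 5000          # a 15,000-character stand-in article
print(fits_context(article, 8192))    # too big for an 8k window -> False
print(fits_context(article, 131072))  # fits easily in a 128k window -> True
```

An article that fails this check for an 8k model is exactly the case where a long-context model like Phi-3-medium-128k becomes the practical choice.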
We run [a lengthy Chinese article on the Forbidden City's collaboration with the Palace of Versailles](https://hackmd.io/vuFYZTVsQZyKmkeQ3ThZQw?view#Source-text) through our Translation Agent, powered by a Phi-3-medium-128k model we start locally.
### Step 3.1: Run a Phi-3-medium-128k Gaia node
Configure and download the model.
```
gaianet init --config https://raw.githubusercontent.com/GaiaNet-AI/node-configs/main/phi-3-medium-instruct-128k/config_full.json
```
Next, use the following command to start the Gaia node with a 128k context window.
```
gaianet start
```
### Step 3.2: Run the Translation Agent on top of Phi-3-medium-128k
Find the `examples/example_script.py` file in your cloned agent repo and review its code. It tells the agent where to find your document and how to translate it. Change the model name to the one you are using (here, `Phi-3-medium-128k-instruct-Q5_K_M`), and set the source and target languages you want (here, `Chinese` as the source language and `English` as the target language).
```
import os

import translation_agent as ta

if __name__ == "__main__":
    source_lang, target_lang, country = "Chinese", "English", "Britain"

    relative_path = "sample-texts/long_article.txt"
    script_dir = os.path.dirname(os.path.abspath(__file__))
    full_path = os.path.join(script_dir, relative_path)

    with open(full_path, encoding="utf-8") as file:
        source_text = file.read()

    print(f"Source text:\n\n{source_text}\n------------\n")

    translation = ta.translate(
        source_lang=source_lang,
        target_lang=target_lang,
        source_text=source_text,
        country=country,
        model="Phi-3-medium-128k-instruct-Q5_K_M",
    )