print(f"Translation:\n\n{translation}")
```
Then, you can find an `examples/sample-texts` folder in your cloned repo. Put the file you want to translate in this folder and get its path. Here, because we named our source text `long_article.txt`, the relative path to the document would be `sample-texts/long_article.txt`.
```
cd examples
python example_script.py
```
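To make the workflow concrete, here is a minimal sketch of what a translation script like `example_script.py` might do. This is an illustration, not the repo's actual code: it assumes your Gaia node is running locally with its OpenAI-compatible API at `http://localhost:8080/v1`, and the `chunk_size` value and model name are placeholders you would adjust for the model your node serves.
```
# A hedged sketch of a chunked translation workflow against a local Gaia node.
# Assumptions (not from the repo): the node exposes an OpenAI-compatible API at
# http://localhost:8080/v1, and "your-model-name" is whichever LLM it serves.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="GAIA")

def translate(path: str, chunk_size: int = 3000) -> str:
    # Read the source document and split it into chunks that fit the model's context window.
    text = open(path, encoding="utf-8").read()
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

    translated = []
    for chunk in chunks:
        resp = client.chat.completions.create(
            model="your-model-name",  # placeholder for the model your node serves
            messages=[
                {"role": "system", "content": "Translate the following Chinese text into English."},
                {"role": "user", "content": chunk},
            ],
        )
        translated.append(resp.choices[0].message.content)
    return "\n".join(translated)

translation = translate("sample-texts/long_article.txt")
print(f"Translation:\n\n{translation}")
```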
[The translated results were impressive,](https://hackmd.io/vuFYZTVsQZyKmkeQ3ThZQw?view#Source-text) with the translation capturing the nuances and context of the original text with high fidelity.
## Evaluation of Translation Quality
The three models, Llama-3-8B, gemma-2-27b, and Phi-3-medium, exhibit varying levels of performance in translating complex historical and cultural content from Chinese to English.
Llama-3-8B provides a translation that effectively captures the factual content but shows occasional stiffness in language, possibly indicating a direct translation approach that does not fully adapt idiomatic expressions. It does not preserve the section titles or the formatting of the original text, and it leaves certain parts untranslated.
In contrast, the translation by gemma-2-27b is quite accurate and retains the original meaning of the short introductory article about the Forbidden City. gemma-2-27b's translation exhibits a smooth and natural English flow, suggesting a sophisticated understanding of both the source language and the target language's grammatical structures. The choice of words and sentence structures in gemma-2-27b's output demonstrates a high degree of linguistic finesse, suggesting it is well-suited for translating formal and historically nuanced texts.
The Phi-3-medium-128k model can translate book-length text from Chinese to English. It demonstrates robust capabilities in handling large volumes of complex content, suggesting advanced memory handling and contextual awareness. The quality of the translation remains consistent even as the text grows longer, indicating Phi's utility in projects requiring extensive, detailed translations. However, it does make some mistakes, such as rendering "Wenhua Hall" as "also known as Forbidden City" in the first paragraph.
Overall, each model has its strengths, with gemma-2-27b standing out for linguistic finesse and Phi-3-medium-128k for handling lengthy texts.
## Conclusion
[Gaia](https://github.com/GaiaNet-AI) provides an easy way to select and use different open-source LLMs in your agentic applications to fully take advantage of their finetuned capabilities for specific tasks.
Once you have a local Gaia node up and running, you could share it with others and make $$$ by joining the [Gaia network](https://www.gaianet.ai/)!