---
pipeline_tag: text-generation
library_name: transformers
language:
- en
license: fair-noncommercial-research-license
datasets:
- malte0621/test-dataset
---
# Test Model
This model was created for testing purposes. It is one of the smallest models on Hugging Face, built for experimenting with the capabilities of very small (tiny) models.
## Model Structure
The model handles text generation tasks. It was trained on a small dataset of user prompts and AI responses, so it generates a response conditioned on a given input prompt.
## Usage
You can use this model for text generation tasks, particularly for testing and experimentation with small models. It is not intended for production use or serious applications.
### Prompt format
```
<|user|>Your input prompt here<|ai|>The model's response will be generated here<|stop|>
```
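The markers above can be assembled and stripped with a small helper. This is a hypothetical convenience sketch, not part of the model repository; the marker strings themselves are taken from the format above:

```python
# Special markers from the model's prompt format.
USER, AI, STOP = "<|user|>", "<|ai|>", "<|stop|>"

def build_prompt(user_message: str) -> str:
    """Wrap a user message in the chat markers, leaving the <|ai|>
    tag open so generation continues as the assistant's reply."""
    return f"{USER}{user_message}{AI}"

def extract_response(generated: str) -> str:
    """Pull the assistant's reply out of the full generated text,
    cutting at the <|stop|> marker if the model emitted one."""
    reply = generated.split(AI, 1)[-1]
    return reply.split(STOP, 1)[0].strip()

print(build_prompt("Hi there!"))  # <|user|>Hi there!<|ai|>
```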
### Example Usage
#### Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Malte0621/test-model"

# Load the GGUF weights via transformers' GGUF support.
model = AutoModelForCausalLM.from_pretrained(model_name, gguf_file="model.gguf")
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "<|user|>Hi there!<|ai|>"
inputs = tokenizer(prompt, return_tensors="pt")

# Pass the attention mask along with the input ids and cap the output length.
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
#### llama.cpp
```bash
./llama-cli -m model.gguf -p "<|user|>Hi there!<|ai|>" -n 128 -no-cnv --rope-freq-scale 0.125
```
#### LM Studio
https://model.lmstudio.ai/download/Malte0621/test
**(Make sure to set the RoPE frequency scale to 0.125 in the model settings.)**
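With linear RoPE scaling, a frequency scale of 0.125 multiplies positions by 1/8, so the usable context window grows by the reciprocal of the scale. A quick sanity check of that relationship (the base context length below is purely illustrative; check the model config for the real value):

```python
rope_freq_scale = 0.125

# Positions are multiplied by the scale factor, so the usable
# context window grows by its reciprocal.
context_multiplier = 1 / rope_freq_scale
print(context_multiplier)  # 8.0

# Example: a model trained on a 512-token window (hypothetical value)
# would then address 512 * 8 = 4096 positions.
base_context = 512
print(int(base_context * context_multiplier))  # 4096
```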
## License
This model is released under the [Fair Noncommercial Research License](https://huggingface.co/Malte0621/test/blob/main/LICENSE).
## Citation
If you use this model in your research, please cite it as follows:
```bibtex
@misc{test-model,
  author = {Malte0621},
  title  = {Test-Model},
  year   = {2025},
  url    = {https://huggingface.co/Malte0621/test}
}
```
## Acknowledgements
This model was created as part of a personal project to explore the capabilities of small language models. It is not affiliated with any organization or commercial entity.