---

pipeline_tag: text-generation
library_name: transformers
language:
  - en
license: fair-noncommercial-research-license
datasets:
  - malte0621/test-dataset
---

# Test Model
This model was created for testing purposes. It is one of the smallest models available on Hugging Face and is intended for experimenting with the capabilities of small/tiny models.

## Model Structure
The model handles text generation tasks. It was trained on a small dataset of user prompt / AI response pairs, allowing it to generate a response to a given input.

## Usage
You can use this model for text generation tasks, particularly for testing and experimentation with small models. It is not intended for production use or serious applications.

### Prompt format
```
<|user|>Your input prompt here<|ai|>The model's response will be generated here<|stop|>
```
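
As a rough illustration, a small helper can wrap a user message in this template and trim the generated text at the `<|stop|>` token. This is just a sketch; the helper names below are hypothetical and not part of the model repository.

```python
# Hypothetical helpers for working with the prompt template above.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in the <|user|>...<|ai|> template."""
    return f"<|user|>{user_message}<|ai|>"

def extract_response(generated_text: str, prompt: str) -> str:
    """Drop the prompt prefix and cut the reply at the <|stop|> token."""
    reply = generated_text[len(prompt):]
    return reply.split("<|stop|>")[0].strip()

prompt = build_prompt("Hi there!")
print(prompt)  # <|user|>Hi there!<|ai|>
```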

### Example Usage


#### Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Malte0621/test-model"

# Load the GGUF weights and the tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained(model_name, gguf_file="model.gguf")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Wrap the input in the prompt format described above.
prompt = "<|user|>Hi there!<|ai|>"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

response = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(response[0], skip_special_tokens=True))
```

#### llama.cpp
```bash
./llama-cli -m model.gguf -p "<|user|>Hi there!<|ai|>" -n 128 -no-cnv --rope-freq-scale 0.125
```
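
If you prefer driving llama.cpp from Python, a minimal sketch using the llama-cpp-python bindings might look like the following. It assumes `model.gguf` has already been downloaded locally; as with the CLI, the RoPE frequency scale needs to be set to 0.125.

```python
# Sketch using llama-cpp-python; assumes model.gguf is available locally.
from llama_cpp import Llama

llm = Llama(model_path="model.gguf", rope_freq_scale=0.125)

output = llm(
    "<|user|>Hi there!<|ai|>",
    max_tokens=128,
    stop=["<|stop|>"],  # cut generation at the model's stop token
)
print(output["choices"][0]["text"])
```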

#### LM Studio
https://model.lmstudio.ai/download/Malte0621/test
**(Make sure to set the RoPE frequency scale to 0.125 in the model settings.)**

## License
This model is released under the [Fair Noncommercial Research License](https://huggingface.co/Malte0621/test/blob/main/LICENSE).

## Citation
If you use this model in your research, please cite it as follows:
```
@misc{test-model,
  author = {Malte0621},
  title = {Test-Model},
  year = {2025},
  url = {https://huggingface.co/Malte0621/test}
}
```

## Acknowledgements
This model was created as part of a personal project to explore the capabilities of small language models. It is not affiliated with any organization or commercial entity.