Socratic Tutor - Qwen2.5 Fine-tuned
A fine-tuned Qwen2.5-7B model designed to act as a Socratic tutor, guiding learning through questioning rather than providing direct answers.
Model Description
This model has been fine-tuned on synthetic Socratic dialogue data to embody the teaching philosophy of Socrates: helping students discover insights themselves through probing questions and guided inquiry.
Key Features
- Socratic Method: Asks thought-provoking questions to guide learning
- Educational Focus: Designed for tutoring and educational conversations
- Based on Qwen2.5-7B: Built on Alibaba's Qwen2.5-7B-Instruct instruction-following model
- LoRA Fine-tuning: Efficiently trained using Low-Rank Adaptation
Training Details
- Base Model: Qwen/Qwen2.5-7B-Instruct
- Training Method: LoRA (Low-Rank Adaptation)
- Training Data: 350 synthetic Socratic dialogues
- Training Duration: 3 epochs
- Final Loss: 1.16 (down from 4.86)
Training Configuration
- LoRA rank: 16
- Learning rate: 0.0002
- Batch size: 16 (effective)
- Max sequence length: 2048 tokens
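As a rough sketch, the configuration above corresponds to a PEFT `LoraConfig` along the following lines; the LoRA alpha, dropout, target modules, and gradient-accumulation split are assumptions, not the exact training script.

```python
# Hedged sketch of the LoRA setup listed above; values marked "assumed"
# are not stated in this card.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

lora_config = LoraConfig(
    r=16,                              # LoRA rank (as listed above)
    lora_alpha=32,                     # assumed
    lora_dropout=0.05,                 # assumed
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
    task_type="CAUSAL_LM",
)

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()

# Training then ran for 3 epochs at a 2e-4 learning rate with an effective
# batch size of 16 (e.g. per-device batch 2 x gradient accumulation 8, assumed)
# and a 2048-token maximum sequence length.
```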
Available Formats
HuggingFace Format (15.2GB)
The full precision model in standard HuggingFace format.
GGUF Format (4.4GB)
File: socratic-tutor-v2-q4_k_m.gguf
- Quantization: Q4_K_M (4-bit mixed quantization)
- Size: 4.4GB (down from 15.2GB)
- Quality: Excellent balance of size and performance
- Compatible with: llama.cpp, Ollama, and other GGUF-compatible tools
Important: The GGUF file contains the default Qwen2.5 chat template. To activate Socratic tutoring behavior, you must provide the custom system prompt shown below when using the model.
Usage
HuggingFace Transformers
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "RuudFontys/socratic-tutor-qwen2.5", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("RuudFontys/socratic-tutor-qwen2.5")

# Example conversation
messages = [
    {"role": "system", "content": "You are a Socratic tutor who guides learning through questioning. Your role is to help students discover insights themselves by asking probing questions rather than providing direct answers."},
    {"role": "user", "content": "Can you explain photosynthesis to me?"}
]

inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens (the prompt is included in `outputs`)
response = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(response)
```
GGUF with llama.cpp
```bash
# Download the GGUF file
wget https://huggingface.co/RuudFontys/socratic-tutor-qwen2.5/resolve/main/socratic-tutor-v2-q4_k_m.gguf

# Run with llama.cpp in conversation mode; in this mode the -p prompt is used as the system prompt
./llama-cli -m socratic-tutor-v2-q4_k_m.gguf --conversation \
  -p "You are a Socratic tutor who guides learning through questioning. Your role is to help students discover insights themselves by asking probing questions rather than providing direct answers."
```
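GGUF with Ollama
Ollama is listed as compatible above, and because the GGUF ships with the default Qwen2.5 chat template, the Socratic system prompt has to be supplied explicitly. Below is a minimal sketch using an Ollama Modelfile; the model name `socratic-tutor` and the temperature setting are illustrative choices, not part of this release.

```bash
# Create a local Ollama model that bakes in the Socratic system prompt
cat > Modelfile <<'EOF'
FROM ./socratic-tutor-v2-q4_k_m.gguf
SYSTEM "You are a Socratic tutor who guides learning through questioning. Your role is to help students discover insights themselves by asking probing questions rather than providing direct answers."
PARAMETER temperature 0.7
EOF

ollama create socratic-tutor -f Modelfile
ollama run socratic-tutor "Can you explain photosynthesis to me?"
```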
Recommended System Prompt
For best results, use this system prompt:
```
You are a Socratic tutor who guides learning through questioning. Your role is to help students discover insights themselves by asking probing questions rather than providing direct answers.

Key principles:
- Ask thought-provoking questions that lead students to discover answers themselves
- Avoid giving direct explanations unless absolutely necessary
- Build on student responses with follow-up questions
- Help students think critically and make connections
- Guide them through reasoning processes step by step
- Encourage curiosity and deeper exploration of topics

Remember: The goal is not to show how much you know, but to help the student learn through their own discovery.
```
Example Interactions
Student: "Why is my boxplot showing outliers?" Tutor: "Outliers are beyond 1.5รIQR from the quartiles. Which values exceed that threshold in your data?"
Student: "How does machine learning work?" Tutor: "What do you think it means for a machine to 'learn' from data? How might that differ from how humans learn?"
Intended Use
This model is designed for:
- Educational tutoring and guidance
- Socratic dialogue practice
- Critical thinking development
- Question-based learning approaches
Limitations
- May not provide direct answers when they would be more appropriate
- Performance depends on the quality of input questions
- Limited to the knowledge in the base Qwen2.5 model
Citation
If you use this model, please cite:
```bibtex
@misc{socratic-tutor-qwen25,
  title={Socratic Tutor: A Fine-tuned Qwen2.5 Model for Educational Dialogue},
  author={RuudFontys},
  year={2025},
  url={https://huggingface.co/RuudFontys/socratic-tutor-qwen2.5}
}
```
License
This model inherits the license from the base Qwen2.5 model. Please refer to the original model's licensing terms.