VSCode Continue.dev LM Studio models for Apple Silicon
Collection · 9 items

If you are using Apple Silicon with VSCode, Continue.dev, and LM Studio, these are the recommended models. Note that you will need to redo the YAML indentation when copying the model entries into your Continue config.
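Since Continue.dev reads its model list from a YAML config where indentation matters, a minimal sketch of what an entry for this model might look like (the file layout, provider name, and role below follow Continue's config.yaml conventions and are assumptions, not taken from this collection):

```yaml
# Hypothetical ~/.continue/config.yaml entry -- indentation must be exact
name: Local Assistant
version: 0.0.1
models:
  - name: Instinct (LM Studio)
    provider: lmstudio          # assumes LM Studio's local server is running
    model: aciidix/instinct-mlx-4Bit
    roles:
      - autocomplete
```

If Continue fails to pick up a model, mis-nested list items under `models:` are a common cause, so re-check the spacing after pasting.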
The model aciidix/instinct-mlx-4Bit was converted to MLX format from continuedev/instinct using mlx-lm version 0.29.1.
```shell
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hugging Face Hub
model, tokenizer = load("aciidix/instinct-mlx-4Bit")

prompt = "hello"

# Wrap the prompt in the model's chat template if one is available
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```