How to easily run on Windows OS?

#11
by lbarasc - opened

I want to use the model. Can you give me some advice on finding an .exe application under Windows to run it directly?
Thank you for everything.

Hello lbarasc, you can try LM Studio if you do not already know this great app: https://lmstudio.ai

(edit: it does not work, we have to wait for an update from llama.cpp)

Yes, lmstudio.ai can easily download the *.gguf.
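
If you prefer to fetch the GGUF file yourself instead of going through LM Studio's downloader, a minimal sketch like the following should work. The repo id and filename are taken from the error path quoted later in this thread and are assumptions; double-check them on the model page.

```python
# Minimal sketch: download the BitNet GGUF file directly from the Hugging Face Hub.
# pip install huggingface_hub
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="microsoft/bitnet-b1.58-2B-4T-gguf",  # assumed repo id
    filename="ggml-model-i2_s.gguf",              # assumed filename (i2_s quantization)
)
print("Model downloaded to:", path)
```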

llama.cpp is updated frequently.

🥲 Failed to load the model

Failed to load model

error loading model: llama_model_loader: failed to load model from .lmstudio\models\microsoft\bitnet-b1.58-2B-4T-gguf\ggml-model-i2_s.gguf

It is not supported by either LM Studio or Ollama.

It seems there is no easy way. From my perspective, the easiest option is to build it yourself. Get Visual Studio 2019 and LLVM ready, manually run the kernel header generation command beforehand, and add some parameters to cmake -B. (Skip the conda environment part.) A rough sketch of those steps is below.
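
Here is a rough, untested sketch of what that build route might look like on Windows. The codegen script name, its flags, and the cmake options are assumptions, not verified against the current BitNet repo; check the repo's README for the exact commands, and run everything from a Developer Command Prompt for VS 2019 with LLVM/clang on PATH.

```python
# Rough sketch of the "build it yourself" route described above.
import subprocess

def run(cmd):
    """Echo and execute one build command, failing fast on errors."""
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Generate the kernel headers first (script name and flags are assumptions;
#    check the utils/ folder of the BitNet repo for the exact codegen script).
run(["python", "utils/codegen_tl1.py", "--model", "bitnet_b1_58-3B"])  # hypothetical flags

# 2. Configure with clang/clang++ instead of MSVC (the extra parameters added to `cmake -B`).
run([
    "cmake", "-B", "build",
    "-DCMAKE_C_COMPILER=clang",
    "-DCMAKE_CXX_COMPILER=clang++",
])

# 3. Build the inference binaries in Release mode.
run(["cmake", "--build", "build", "--config", "Release"])
```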
