
Use Ollama to run the SeaLLM model (Southeast Asia 🇬🇧 🇨🇳 🇻🇳 🇮🇩 🇹🇭 🇲🇾 🇰🇭 🇱🇦 🇲🇲 🇵🇭 Large Language Models)

Run an imported model with Ollama. We will learn how to import and run a GGUF model with Ollama. (A Thai-language LLM.)

Sarin Suriyakoon
3 min read · Feb 23, 2024


Why SeaLLM?

I am Thai, so it makes total sense to explore what is close to me and my folks. For more about SeaLLM, check out the SeaLLMs documentation.

And I love Ollama!

Introduction to Ollama+SeaLLM

One feature that I like about Ollama is that it lets us run models outside of its official list, such as an open-source model from Hugging Face. Just make sure the model is in GGUF format.

Tip: When you find a model on Hugging Face but can’t find a GGUF version in the Files section, try searching for the model name with a “GGUF” suffix.

For example, in SeaLLM’s case: “SeaLLM GGUF”

Yes, there is a one-and-done command to run SeaLLM with Ollama. If that is all you are looking for, head to the SeaLLMs documentation or just run:

ollama run nxphi47/seallm-7b-v2:q4_0

However, if you want to learn how to import a model for Ollama to run, this is a good place to start.

Let’s do it.

Step by Step

  • Download the SeaLLMs GGUF file here
  • Create a Modelfile locally
  • Open the Modelfile in your editor and add the path to the GGUF file
FROM ./SeaLLM-7B-v2.q4_0.gguf
  • Create a model from the Modelfile with this command
ollama create SeaLLM -f Modelfile
  • Run the model as a chatbot
ollama run SeaLLM
  • Say hello with “จงบอกวิธีทำกะเพราไก่ไข่ดาว” (“Tell me how to make stir-fried basil chicken with a fried egg”)
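The steps above can be sketched as one small script. The GGUF filename matches the Modelfile shown earlier; the `ollama` commands are left commented since they require Ollama installed and the model file on disk.

```shell
# Create the Modelfile pointing at the downloaded GGUF file
cat > Modelfile <<'EOF'
FROM ./SeaLLM-7B-v2.q4_0.gguf
EOF

# Register the model and start chatting (requires ollama and the GGUF file):
# ollama create SeaLLM -f Modelfile
# ollama run SeaLLM
```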

Observe the result

Experiment 1: “จงบอกวิธีทำกะเพราไก่ไข่ดาว” (“Tell me how to make stir-fried basil chicken with a fried egg”)

Experiment 2: “จงบอกวิธียื่นภาษีเงินได้พนักงานบริษัท” (“Tell me how to file income tax as a company employee”)

Something more to learn: model templates

There is a small improvement you can make in terms of the template.

First, let’s check out our default template with this command:

ollama show seallm --modelfile

We should see this result:

TEMPLATE """{{ .Prompt }}"""

Second, compare it with the official one:

ollama show nxphi47/seallm-7b-v2:q4_0 --modelfile

We should see this result:

TEMPLATE """{{- if .First }}<|im_start|>system
{{ .System }}</s>{{- end }}<|im_start|>user
{{ .Prompt }}</s><|im_start|>assistant
"""
SYSTEM """You are a helpful assistant."""
PARAMETER stop "</s>"
PARAMETER stop "<|im_start|>"

As you can see, the official model template contains more than our default one. For how templates are interpreted, look into the Ollama Template documentation and the SeaLLM usage notes.
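To make our imported model behave like the official one, we can copy the official template, system prompt, and stop parameters into our own Modelfile and re-create the model. A minimal sketch, assuming the GGUF file from the earlier steps is in the current directory:

```shell
# Rewrite our Modelfile with the official SeaLLM chat template
# (copied from `ollama show nxphi47/seallm-7b-v2:q4_0 --modelfile` above)
cat > Modelfile <<'EOF'
FROM ./SeaLLM-7B-v2.q4_0.gguf

TEMPLATE """{{- if .First }}<|im_start|>system
{{ .System }}</s>{{- end }}<|im_start|>user
{{ .Prompt }}</s><|im_start|>assistant
"""
SYSTEM """You are a helpful assistant."""
PARAMETER stop "</s>"
PARAMETER stop "<|im_start|>"
EOF

# Re-create the model so the new template takes effect
# (requires ollama and the GGUF file on disk):
# ollama create SeaLLM -f Modelfile
```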

Conclusion

Now you can download any GGUF model file from Hugging Face and run it through Ollama, which means you have an LLM with a REST API running locally. That opens up tons of possibilities!
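To illustrate the REST API point, here is a small sketch of a request to Ollama’s local `/api/generate` endpoint. It assumes `ollama serve` is running on the default port 11434 and that the model was created under the name `SeaLLM` as above, so the final `curl` is left commented.

```shell
# Build the request body for Ollama's generate endpoint
cat > request.json <<'EOF'
{"model": "SeaLLM", "prompt": "จงบอกวิธีทำกะเพราไก่ไข่ดาว", "stream": false}
EOF

# Send it once the server is up:
# curl http://localhost:11434/api/generate -d @request.json
```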

Have fun hacking!

Recommended Model to Try Out

Interesting Discussion

More information on Thai-language usage with SeaLLM:

https://huggingface.co/SeaLLMs/SeaLLM-7B-v2/discussions/4

