Are you looking for an easy way to run LLMs in your local environment? Ollama is a straightforward option. Open-WebUI (formerly known as Ollama-webui) even gives you a ChatGPT-like web interface! To get everything up and running, you just need to follow a few simple steps.
Step 1: Install Ollama locally and deploy your preferred LLM. For example, you can run the command
$ ollama run llama2
to download the llama2 model and drop into an interactive chat session. Ollama supports a variety of models, listed at ollama.com/library.
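If you want to sanity-check that Ollama is up without using the interactive prompt, you can also call its REST API, which by default listens on localhost:11434 (the exact response fields depend on your Ollama version):
$ curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "Why is the sky blue?", "stream": false}'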
Step 2: Run Open-WebUI to get a web interface. First make sure the podman machine is running:
$ podman machine start
and then start the container:
$ podman run -d -p 3000:8080 --network slirp4netns:allow_host_loopback=true -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
This starts the Open-WebUI container and maps port 3000 on the host to port 8080 in the container, so HTTP traffic to localhost:3000 is forwarded to Open-WebUI. Now open your browser and navigate to http://localhost:3000 to access the web interface.
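If the page doesn't load, check that the container actually came up before digging deeper. These are standard podman commands; your output will of course differ:
$ podman container list    # open-webui should be listed as Up
$ podman logs open-webui   # look for startup errors in the log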
Let's break down the command:
podman run -d -p 3000:8080 --network slirp4netns:allow_host_loopback=true -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
- -d: Run the container in detached mode, meaning it will run in the background and not interact with the terminal.
- -p 3000:8080: Map port 3000 on the host to port 8080 in the container, allowing incoming HTTP traffic on port 3000 to be forwarded to the container.
- --network slirp4netns:allow_host_loopback=true: Use slirp4netns user-mode networking and allow the container to reach the host's loopback interface, exposed inside the container at the special address 10.0.2.2. This is what lets Open-WebUI talk to Ollama listening on localhost:11434 on the host.
- -v open-webui:/app/backend/data: Mount the named volume open-webui at /app/backend/data inside the container (not a host directory, as the syntax might suggest). This persists chat history and settings even if the container is removed and re-created; see the volume example after this list.
- --name open-webui: Set the name of the container.
- --restart always: Restart the container automatically if it crashes or is terminated.
- ghcr.io/open-webui/open-webui:main: The container image to run; the :main tag tracks the main branch of the Open-WebUI repository.
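Since that named volume is where all your Open-WebUI data lives, it's worth knowing how to find it on disk. podman volume inspect prints the volume's metadata, including its Mountpoint (the host path will differ on your machine):
$ podman volume inspect open-webui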
Pretty cool, right?
- Ollama - https://github.com/ollama/ollama
- Open-WebUI (formerly known as Ollama-webui) - https://github.com/open-webui/open-webui
A few handy commands for managing the setup:
- podman machine list: check the status of the podman machine
- podman machine stop: stop the podman VM
- podman machine start: start the podman VM
- podman container list: list running containers
- podman container rm open-webui: remove the container (your data stays in the open-webui volume)
- ollama serve: run the Ollama server in the foreground
- lsof -i:3000: see which process is listening on port 3000
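One place these come in handy is updating Open-WebUI: pull the newer image, remove the old container, and re-run the exact command from Step 2. This is just a sketch, but it should be safe because your data lives in the open-webui volume, not in the container:
$ podman pull ghcr.io/open-webui/open-webui:main
$ podman stop open-webui
$ podman rm open-webui
$ podman run -d -p 3000:8080 --network slirp4netns:allow_host_loopback=true -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main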