GMKTec Evo X-2 AI Mini PC + Ubuntu 25.10
Getting the Evo X-2 running on Ubuntu 25.xx (25.10 at time of writing)
Intro
I got an Evo X-2 to build a homelab local-LLM one-stop shop (Ollama + Open WebUI, and eventually some image-generation front end, but I'm out of energy for now). I hit a lot of weird issues and caveats getting it up and running, so I figured I'd collect the commands I used here in case anyone else runs into the same stuff.
TODO
Add breakout snippet for Ollama service entry edits
Decide on image gen front-end / wrapper, install, troubleshoot, configure, update this
Commands Ran
# Get Ubuntu up to Date & grab curl
sudo apt update
sudo apt upgrade
sudo apt install curl
# Enable SSH (todo: harden)
sudo systemctl enable ssh --now
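On the hardening TODO: a minimal sketch of the usual first steps, assuming key-based login is already working (verify that before applying this, or you can lock yourself out). Ubuntu's sshd reads drop-ins from /etc/ssh/sshd_config.d/:

```shell
# Disable password auth and root login via a drop-in
sudo tee /etc/ssh/sshd_config.d/99-hardening.conf <<'EOF'
PasswordAuthentication no
PermitRootLogin no
EOF
# Validate the config before restarting the daemon
sudo sshd -t && sudo systemctl restart ssh
```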
# Install Claude Code CLI to help troubleshoot things
curl -fsSL https://claude.ai/install.sh | bash
# Install ROCm (I think this worked)
wget https://repo.radeon.com/amdgpu-install/7.1/ubuntu/noble/amdgpu-install_7.1.70100-1_all.deb
sudo apt install ./amdgpu-install_7.1.70100-1_all.deb
sudo apt update
sudo apt install python3-setuptools python3-wheel
sudo usermod -a -G render,video $LOGNAME
sudo apt install rocm
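A quick sanity check that ROCm actually sees the GPU (assuming the packages above installed cleanly; note the render/video group change only takes effect after logging out and back in):

```shell
# Should list an AMD gfx agent (Strix Halo iGPU) if ROCm is working
rocminfo | grep -i gfx
rocm-smi   # GPU utilization / VRAM summary
```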
# Install pyenv & set Python 3.12 to global version
sudo apt install git && curl -fsSL https://pyenv.run | bash
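Note: the pyenv installer only clones pyenv; it prints shell-profile lines that you have to add yourself, otherwise `pyenv` won't be on PATH in new shells. The standard bash lines from the pyenv README:

```shell
# Append pyenv init to ~/.bashrc, then reload the shell
cat >> ~/.bashrc <<'EOF'
export PYENV_ROOT="$HOME/.pyenv"
[[ -d $PYENV_ROOT/bin ]] && export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init - bash)"
EOF
exec "$SHELL"
```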
pyenv install 3.12
pyenv global 3.12
python --version # verify
# Install packages necessary for python dev (these also prevent issues with Open WebUI)
sudo apt install -y build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev curl git libncursesw5-dev xz-utils tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev
# Install & Configure ollama service (todo: snippet of service entry)
curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl edit ollama.service # set ollama to serve on 0.0.0.0
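For reference, the override is just an environment line in the drop-in that `systemctl edit` opens (OLLAMA_HOST is the env var Ollama's docs use for the bind address):

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```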
sudo systemctl daemon-reload && sudo systemctl restart ollama
sudo systemctl enable ollama --now
systemctl status ollama
# Install open-webui (defaults to 0.0.0.0, no extra config necessary)
pip install open-webui
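Then start it (the pip package ships an `open-webui` entry point; it listens on port 8080 by default):

```shell
open-webui serve
# then browse to http://<mini-pc-ip>:8080
```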
# Clean slate
sudo reboot now