Hi, I currently have a spare GeForce GTX 1060 lying around collecting dust. I'm planning to use it with Ollama [https://ollama.com/] for self-hosting my own AI model, or maybe even for AI training. The problem is that none of my home lab devices have a compatible PCIe connection for the GPU. My current setup includes:

- Beelink MINI S12 Intel Alder Lake N100
- Raspberry Pi 5
- Le Potato AML-S905X-CC
- Pi Picos

I'd like to hear recommendations or experiences with external GPU docks I could use to connect the GPU to my home lab setup. Thanks.
I haven't used any myself, but I've done some research:
- The Minisforum DEG1 looks like the most polished option, but you'd have to add an M.2-to-OCuLink adapter and cable.
- ADT-Link makes a wide variety of kits as well, with varying PCIe generations and varying included equipment.
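Whichever dock you end up with, once the card shows up on the host it's worth a quick sanity check that Ollama is actually answering from the GPU box. Here's a minimal sketch using the official Python client; the package name, the default local server, and the model name are all assumptions on my part (pull whatever model you actually want beforehand):

```python
# Minimal sanity check for a self-hosted Ollama instance once the eGPU is working.
# Assumptions (not from the original post): `pip install ollama`, the Ollama
# server is running locally on its default port, and a model has already been
# pulled -- the model name below is just a placeholder.
import time

import ollama

MODEL = "llama3"  # placeholder; run `ollama pull <model>` first

start = time.time()
response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
)

print(response["message"]["content"])
# Rough GPU-vs-CPU signal: on the GTX 1060 this should finish much faster
# than CPU-only inference on the N100 or a Pi.
print(f"elapsed: {time.time() - start:.1f}s")
```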