# [How C3] Private ChatGPT via Open WebUI
If you have multiple devices (smartphone, notebook and maybe even a desktop) and you want to be able to talk to your AI assistant wherever you are, you'll want to set up a server that hosts Open WebUI. This setup even supports multiple users and can serve your whole family. Just keep in mind that you're responsible for the data, as every conversation is going to be stored on your server.

## Installation of Open WebUI
### Step 1: Set up your Server (Unraid 7)
There are many ways to set up a server. If you are totally new to this and don't have old hardware lying around, I recommend getting an N150 mini PC with 16GB RAM like the [Blackview MP20](https://de.blackview.hk/products/mp20-price), but there are dozens of options out there. You'll also need a USB stick to install Unraid on.

If you want something beefier (maybe because you want to host more services besides Open WebUI), take a look at the [Minisforum MS-01](https://www.minisforum.com/products/minisforum-ms-01) or its AMD counterpart, the [MS-A2](https://www.minisforum.com/products/minisforum-ms-a2). Keep in mind that only the MS-01 has integrated vPro, which lets you manage the system remotely even when it has crashed or been shut down. Last year I started to build [my new homelab](http://hackmd.io/@reneil1337/homelab) based on the MS-01 and can recommend it!
Whatever hardware you are using, it's time to boot into Unraid.
<iframe style="width:100%;display:inline-block;padding:0px" height="420" src="https://www.youtube.com/embed/HRn4GrcV6ck" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
*Note: You can also use Proxmox, Coolify or even a simple Ubuntu 24.04 server to host Open WebUI. We use Unraid in this tutorial as it's very beginner-friendly.*
### Step 2: Install Open WebUI
To install Open WebUI, head to the Unraid app store, search for the app and hit install.

*Note: If you're installing on another device, check the [GitHub repo](https://github.com/open-webui/open-webui) and the [official docs](https://docs.openwebui.com/getting-started/quick-start) to see exactly how to install the stack on the operating system of your choice.*

Depending on what else is already installed on your system, you might encounter this popup. Just hit OK, we'll fix the ports in the next step.

You'll be greeted by this screen. In case you've received the popup, scroll down to see what's up with the port error message. Click "Show docker allocations", search for 8080 and you'll find an application that's already running on that port.

Scroll up and change the WebUI port parameter from 8080 to something that's not yet in use - for example 8181. You won't need an OPENAI_API_KEY, so delete whatever is set for that field. In case you're running a local Ollama instance, you can put its URL into OLLAMA_BASE_URL to access your local models as well; for this demo I leave it blank.
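If you're not sure which ports are still free, you can also probe them from any machine on your network with a few lines of Python. This is just a minimal sketch; the IP address is a placeholder for your Unraid server's LAN address, and 8080/8181 are the example ports from above:

```python
import socket

def port_in_use(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

server = "192.168.1.50"  # placeholder: your Unraid server's LAN IP
for port in (8080, 8181):
    print(f"Port {port}: {'in use' if port_in_use(server, port) else 'free'}")
```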

Scroll down and hit apply. Unraid is now going to download and install the application. Once everything has finished successfully, you can hit "Done".

Head to your Dashboard and find open-webui, then click the icon and head into the WebUI. A new tab is going to open and you'll be greeted by Open WebUI.

### Step 3: Plug comput3 into your Open WebUI instance
On the welcome screen hit "Get started" and you'll be prompted to set up the admin account. Put in name, email and password and hit "Create Admin Account".

You're now logged in. Navigate to the top right, click the profile picture and head into the Admin Panel. This is where you can manage your models and other settings.

Click Settings > Connections and in the OpenAI API section click the + button that says "Add Connection" on hover.

Insert the following two fields, then hit "Verify Connection" to see if your credentials work (the snippet below shows the same check done from a script). Hit Save to complete this step.
- Base URL: `https://api.comput3.ai/v1`
- API key: your comput3 API key (scroll to the bottom of this post in case you can't find it)
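Since the comput3 endpoint is OpenAI-compatible (that's why it goes into the "OpenAI API" connection), you can also sanity-check your key outside of Open WebUI by calling the standard `/models` route. A minimal sketch, assuming the `requests` package is installed and your key is exported as the environment variable `C3_API_KEY`:

```python
import os
import requests

api_key = os.environ["C3_API_KEY"]  # placeholder: export your comput3 API key first

resp = requests.get(
    "https://api.comput3.ai/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=10,
)
resp.raise_for_status()

# OpenAI-compatible endpoints return {"data": [{"id": "..."}, ...]}
for model in resp.json().get("data", []):
    print(model["id"])
```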

After you've saved the setting, you can hit the Open WebUI logo in the top left to go back to the landing page. Open WebUI automatically pulls in all models that are available for your API key, and you can now converse with whatever model you like.
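The same connection that Open WebUI uses also works directly from your own scripts. Here's a minimal chat-completion sketch against the comput3 endpoint; the model name is a placeholder, so swap in one of the ids returned by the `/models` check above:

```python
import os
import requests

api_key = os.environ["C3_API_KEY"]  # same key as in the Open WebUI connection

resp = requests.post(
    "https://api.comput3.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "example-model-id",  # placeholder: pick an id from /models
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```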

At this point you can access Open WebUI inside your local network/wifi by typing the server's IP address (plus the port you picked, e.g. 8181) into the address bar on any other device. You'll be prompted to log in with the credentials that you set up in step 3.
If you want to learn more about Open WebUI, there are tons of tutorials on YouTube that explain how to integrate proper web search, MCP servers or your own knowledge base. It's an incredibly powerful system and my daily driver for many things.
## Set up VPN Access from Outside your Home
The idea of a server / homelab is of course that you can access the installed services not just from your home network but also from your smartphone or notebook on the go. There are various solutions for this, but one of the simplest is called Tailscale. It creates a private VPN network called a "tailnet" into which you can hook all your devices.
### Step 1: Install Tailscale in Unraid
Follow this guide to integrate Tailscale into your Unraid Server.
<iframe style="width:100%;display:inline-block;padding:0px" height="420" src="https://www.youtube.com/embed/t0ULcxDx51E" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
### Step 2: Connect Open WebUI with your Tailnet
Once Tailscale is installed, head to the Docker page, click open-webui and hit edit.

Here you flip the "Use Tailscale" switch to on and set a Tailscale Hostname. Scroll down and hit Done.

Now you wait until Tailscale has been installed into the Docker Container.

Hit "View Container Log" to open the console. Inside the console you click the authentication link at the top which is gonna open a new tab in your browser.

Log in to your Tailscale account and then hit connect.

You'll be forwarded to your [admin dashboard](https://login.tailscale.com/admin/machines) where you'll find your machine. Disable key expiry so you don't have to deal with re-authenticating later. Then click the IP address and you'll see the domain via which the application is accessible from anywhere in your tailnet.

That's it! Your Open WebUI server is now accessible outside of your home network.
### Step 3: Install Tailscale on your Devices
Download and install the [Tailscale clients](https://tailscale.com/download) on all devices from which you want to use Open WebUI. There are apps in the official Android and iOS stores, and macOS, Windows and Linux are all supported.
Tip: After installation, set Tailscale to auto-connect on boot, ensuring that all your devices are always connected to your tailnet. You can now access your "private ChatGPT" from everywhere, with the data being stored on your own server at home.
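To double-check that everything works end to end, you can hit your Open WebUI instance over the tailnet from any of these devices. A minimal sketch; the hostname below is a placeholder, so use the address shown in your Tailscale admin dashboard (and add the port if your setup exposes one):

```python
import requests

# Placeholder: replace with the address from your Tailscale admin dashboard
url = "http://open-webui.your-tailnet.ts.net"

try:
    resp = requests.get(url, timeout=5)
    print(f"{url} -> HTTP {resp.status_code}")  # a 200 means Open WebUI is reachable
except requests.RequestException as exc:
    print(f"Could not reach {url}: {exc}")
```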
## Other comput3 Tutorials
Do more with your account. Visit and bookmark this continuously growing [c3 tutorial page](https://hackmd.io/@reneil1337/comput3) that explains all sorts of things your comput3 account enables. From your own personal ChatGPT to agentic systems and media generation, there's tons of stuff to explore.
## How to Find your Comput3 API Key
After you've [acquired](https://dexscreener.com/solana/34qhkhrhyningwruftqjnb2vfv8oqyqd5tvdwwde1man) and [staked](https://app.streamflow.finance/staking/solana/mainnet/DGZebyog1twdFaKwN2KZkBLxDkzX4rh7L61S6TAty8et) some $com tokens, you can [log in to your dashboard](https://launch.comput3.ai/) with your Solana wallet and grab your API key at the bottom of the page. You can do way more in this dashboard, like spinning up ComfyUI instances or launching dedicated Ollama instances with a broader model selection, but in this tutorial we keep it simple.

If you run into issues, join the [official comput3 Discord server](https://discord.gg/EFZPGm9qUu) and ask the community.