Self-Hosting Perplexica and Ollama

Blacknight318 - Jul 8 - Dev Community

Perplexica and Ollama Setup

Are you in the self-hosted camp, enjoying Ollama, and wondering when we'd get something like Perplexity AI, but local and maybe a bit more secure? I had been keeping an eye out when I came across an article on MARKTECHPOST about Perplexica, so I decided to take a crack at it. I ran into a few issues along the way, which we'll work around in the Perplexica setup; besides the config files, there's a component prop that we need to address. Let's dive in.

Ollama Install and Setup

To begin with Ollama, follow these steps:

  1. Run the installation script using

    curl -fsSL https://ollama.com/install.sh | sh
    
  2. Pull the latest version of Llama3 using

    ollama pull llama3:latest
    
  3. Pull the latest version of Nomic-Embed-Text using

    ollama pull nomic-embed-text:latest
    
  4. Edit the Ollama service file by running sudo systemctl edit ollama.service and adding the following lines

    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    
  5. Reload the systemd daemon using

    sudo systemctl daemon-reload
    
  6. Restart the Ollama service using

    sudo systemctl restart ollama
    
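Before moving on, it's worth confirming that Ollama is listening on the network and that both models pulled correctly. Here's a quick sanity check; server-ip is a placeholder for your Ollama machine's IP or hostname.

    # Confirm the service came back up after the restart
    systemctl status ollama

    # List the pulled models; llama3 and nomic-embed-text should both appear
    ollama list

    # From another machine, hit the Ollama API to verify it's reachable over the network
    curl http://server-ip:11434/api/tags

If that curl call returns a JSON list of your models, Ollama is ready for Perplexica to talk to.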

Perplexica Setup

To set up Perplexica, follow these steps:

  1. Clone the Perplexica repository using

    git clone https://github.com/ItzCrazyKns/Perplexica.git
    
  2. Change into the cloned directory, then copy the sample configuration file to a new file named config.toml using

    cd Perplexica
    cp sample.config.toml config.toml
    
  3. Open config.toml in a text editor (such as nano) and make the following changes:

    Change OLLAMA = "http://server-ip:11434" (replace server-ip with your Ollama host's address; TOML values need the quotes)
    

    Comment out the SEARXNG line (the SearXNG URL gets set in docker-compose.yml in a later step), then press Ctrl+X to exit and Y to save. A rough sketch of how this section might look appears after these steps.

  4. Open ui/components/theme/Switcher.tsx in a text editor (such as nano) and change line 10 so that it reads

    const ThemeSwitcher = ({ className, size }: { className?: string; size?: number }) => {
    

    Then press Ctrl+X, then Y to save the file

  5. Open docker-compose.yml in a text editor (such as nano) and make the following changes

    SEARXNG_API_URL=http://server-ip:4000
    NEXT_PUBLIC_API_URL=http://server-ip:3001/api
    NEXT_PUBLIC_WS_URL=ws://server-ip:3001
  6. Build and start the Perplexica container using

    docker compose up -d --build
    
  7. Access Perplexica by visiting http://server-ip:3000 in your web browser. If the page doesn't load, the quick checks below can help narrow down which container is unhappy.
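
For reference, the endpoint section of config.toml should end up looking roughly like the sketch below. The exact section and key names depend on the version of sample.config.toml in the repo when you clone it, and server-ip is a placeholder for your machine's address.

    [API_ENDPOINTS]
    # SEARXNG is commented out here; docker-compose.yml supplies SEARXNG_API_URL instead
    # SEARXNG = "http://localhost:32768"
    OLLAMA = "http://server-ip:11434"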

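Once docker compose has finished building, a couple of commands will tell you whether everything came up. These are standard Docker Compose commands; again, replace server-ip with your server's address.

    # The compose file should bring up SearXNG plus the Perplexica backend and frontend
    docker compose ps

    # Tail the logs if one of the services isn't responding
    docker compose logs -f

    # SearXNG should answer on port 4000 and the Perplexica UI on port 3000
    curl -I http://server-ip:4000
    curl -I http://server-ip:3000
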
That's it! With these steps, you should be able to set up both Perplexica and Ollama on your system. If you found this helpful, please share this post, donate to my Buymeacoffee, or clap if you're reading this on Medium. Till next time, fair winds and following seas!
