Just a stranger trying things.

  • 0 Posts
  • 8 Comments
Joined 2 years ago
Cake day: July 16th, 2023

  • One thing I find useful is turning installation/setup instructions into Ansible roles and tasks. If you’re unfamiliar, Ansible is a tool for automating configuration across large server fleets. In my case I only manage two servers, but having a model parse instructions and convert them to Ansible is still useful, and it helps me learn and understand Ansible at the same time.

    Here is an example of the kind of instructions I find interesting: how to set up Docker on Alpine Linux: https://wiki.alpinelinux.org/wiki/Docker
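
    For illustration, here is a minimal sketch of what those wiki steps might look like as Ansible tasks. The module names are the ones I would reach for (community.general.apk plus builtins), and docker_user is a hypothetical variable; verify against the collections you actually have installed.

      # Hypothetical roles/docker/tasks/main.yml mirroring the wiki steps.
      - name: Install Docker from the Alpine repositories
        community.general.apk:
          name: docker
          state: present

      - name: Enable and start the Docker service (OpenRC)
        ansible.builtin.service:
          name: docker
          state: started
          enabled: true

      - name: Add my user to the docker group
        ansible.builtin.user:
          name: "{{ docker_user }}"  # hypothetical variable for your login
          groups: docker
          append: true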

    Results are actually quite good even for smaller self-hosted models in the 14B range, like the distilled versions of DeepSeek, though I’m sure there are other usable models too.

    I also find it helpful as a programming assistant, both for getting working code and for learning.

    I would not rely on it for factual information, but it usually does a decent job of pointing you in the right direction. Another thing I use it for is spell-checking in a foreign language.


  • Regarding photos and videos specifically:

    I know you said you are getting started with self-hosting, so your question was focused on that, but I would also like to share my experience with ente, which has been working beautifully for my family, partner, and myself. It is truly end-to-end encrypted, with the source code available on GitHub.

    They have reasonable prices, and if you feel adventurous you can also host it yourself. Advanced search and face recognition all run on-device (since they can’t access your data) and work very well. The sharing and collaboration features are great, and they don’t lock features behind accounts, so you can gather memories from other people onto your quota just by sharing a link. You can also have a shared family plan.


  • The interface, open-webui, can run in a container, but ollama itself runs as a service on your system, from my understanding.

    The models are local and only answer queries by default; it all happens on your system without any additional tools. If you want to give them internet access, you can, but that is an option you have to set up yourself. open-webui makes it possible, though I have not tried it; I’ve only seen the option.
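
    As a small illustration of “it all happens on the system”: the sketch below queries a local ollama instance directly over its HTTP API. It assumes the default port (11434) and an already-pulled model; “llama3.2” is just a placeholder for whatever you have installed.

      # Minimal local query against ollama's /api/generate endpoint.
      import json
      import urllib.request

      payload = json.dumps({
          "model": "llama3.2",  # placeholder: any model you have pulled
          "prompt": "Why is the sky blue?",
          "stream": False,  # ask for one complete JSON response
      }).encode("utf-8")

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",  # never leaves this machine
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])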

    I have never heard of any LLM that would “answer base queries offline before contacting their provider for support”. It’s practically impossible for an LLM to do that by itself without you setting things up for it that way.


  • What’s great is that with ollama and open-webui, you can just as easily run it all locally on one computer using the open-webui pip package, or on a remote server using the container version of open-webui.
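
    From memory, the two setups look roughly like this (double-check the open-webui docs for the current commands and image tag):

      # Locally, via pip (it wanted a recent Python, 3.11 if I recall):
      pip install open-webui
      open-webui serve  # UI on http://localhost:8080

      # Or on a remote server, via the container image:
      docker run -d -p 3000:8080 \
        -v open-webui:/app/backend/data \
        --name open-webui \
        ghcr.io/open-webui/open-webui:main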

    I’ve run both, and the web UI is really well done. It offers a number of advanced options, like setting the system prompt, but also memory features, document upload for RAG, and even a built-in Python environment for when you want to execute Python functions. You can even enable web browsing for your model.

    I’m personally very pleased with open-webui and ollama; they work wonders together. Highly recommend it! The latest llama3.1 (in 8B and 70B variants) and llama3.2 (in 1B and 3B variants) work very well, and the latter even runs fine on CPU only! Give it a shot, it is so easy to set up :)
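
    If you want to try the exact models I mentioned, it is about as simple as this (tags from the ollama library; check there for the current names):

      # Pull and chat with a small, CPU-friendly llama3.2 variant.
      ollama pull llama3.2:3b
      ollama run llama3.2:3b  # interactive chat in the terminal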