I mostly understand the dilemma, but I want to see if anyone has had better success with their AI assistant. I use the Ollama integration and have set up a conversation model. The default Home Assistant agent knows to use the home forecast entity whenever I ask about the weather, but no matter whether I also set up an AI task model, toggle “control Home Assistant” on or off, or toggle “perform local commands” on or off, the Ollama models never reference the home forecast the way the default agent does. I thought keeping the default commands enabled might preserve that ability while the Ollama LLM answers all other queries. I just want a smarter AI. Any suggestions?

  • Danitos@reddthat.com
    25 days ago

    Not in Home Assistant directly, but I made a flow in n8n where an LLM can detect that I’m asking about the weather, call a weather API, parse the response, and answer.
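    Outside of n8n, the same pattern (detect a weather question, call a weather API, parse the JSON, answer) can be sketched in plain Python. This is a minimal sketch, not the n8n flow itself: the intent check here is a naive keyword match standing in for the LLM step, and the sample response is abbreviated from the shape Open-Meteo’s forecast endpoint documents.

```python
def is_weather_question(text: str) -> bool:
    # Naive intent detection; in the n8n flow an LLM with tool-calling
    # decides this instead of a keyword list.
    keywords = ("weather", "forecast", "rain", "temperature")
    return any(word in text.lower() for word in keywords)

def summarize_forecast(api_response: dict) -> str:
    # Parse the JSON a weather API returns (shape modeled on Open-Meteo's
    # "current_weather" block) into a spoken-style answer.
    current = api_response["current_weather"]
    return (f"It is currently {current['temperature']}°C "
            f"with wind at {current['windspeed']} km/h.")

# Abbreviated sample response in Open-Meteo's documented format.
sample = {"current_weather": {"temperature": 18.3, "windspeed": 12.0}}

if is_weather_question("What's the weather like today?"):
    print(summarize_forecast(sample))
```

    In the real flow you would replace `sample` with the body of an HTTP request to the weather API and feed the summary back as the LLM’s answer.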

    • Dave@lemmy.nz
      25 days ago

      You can do something similar in Home Assistant.

      Add an integration to a weather service (there might even be one out of the box).

      Create an automation triggered by saying a sentence to your voice assistant.

      Set the automation action to be a conversation response, and point that at whatever entity contains the part of the weather you want it to say (or a template if you want it to report multiple values or do fancier things).
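      A minimal sketch of those steps as a Home Assistant automation. The `weather.home` entity id and the trigger sentences are assumptions; substitute whatever entity your weather integration actually creates.

```yaml
# Hypothetical example: entity id and sentences are placeholders.
automation:
  - alias: "Voice weather report"
    trigger:
      - platform: conversation
        command:
          - "what's the weather [like]"
          - "how is the weather"
    action:
      - set_conversation_response: >-
          It is {{ states('weather.home') }}, currently
          {{ state_attr('weather.home', 'temperature') }} degrees.
```

      The `set_conversation_response` action is what makes the voice assistant speak the templated answer instead of its default confirmation.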