this post was submitted on 11 Mar 2026

homeassistant


I mostly understand the dilemma, but I want to see if someone has had better success with their AI assistant. I use the Ollama integration and set up a conversation model. The default Home Assistant assistant knows to use the home forecast entity whenever I ask about the weather, but the Ollama models never reference the home forecast the way the default assistant does — regardless of whether I also set up an AI task model, toggle “control Home Assistant” on or off, or toggle “perform local commands” on or off. I thought keeping default commands enabled might preserve that ability while all other queries were answered by the Ollama LLM. I just want a smarter assistant. Any suggestions?

Dave@lemmy.nz | 1 points | 5 days ago (last edited 5 days ago)

You can do something similar in Home Assistant.

Add an integration to a weather service (there might even be one out of the box).

Create an automation triggered by saying a sentence to your voice assistant.

Set the automation's action to be a conversation response, and point it at whatever entity contains the part of the weather you want it to report (or use a template if you want it to say multiple values or other fancy things).
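Those steps might look roughly like this in YAML — a minimal sketch, assuming your weather integration created an entity called `weather.home` (substitute your own entity ID, and phrase the trigger sentences however you like):

```yaml
automation:
  - alias: "Voice weather report"
    trigger:
      # Fires when any of these sentences is spoken to the voice assistant
      - platform: conversation
        command:
          - "what is the weather [like]"
          - "weather report"
    action:
      # Speak back a templated response built from the weather entity's
      # state and attributes
      - set_conversation_response: >-
          It's currently {{ states('weather.home') }} and
          {{ state_attr('weather.home', 'temperature') }} degrees.
```

The square brackets in the trigger sentence mark an optional word, so "what is the weather" and "what is the weather like" both match. The template pulls the current condition from the entity's state and the temperature from its attributes; you could extend it with humidity, wind, or anything else the entity exposes.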