These are a great example that I might use in the office. Everything makes sense in isolation, but taken together the wind, waves, and sails don't quite match, in a way I couldn't put my finger on.
I can’t decide if I want this to have been written by an AI or not.
Better than that, if you are after more than one (and with GU10s, who isn’t?)
This gives you 3 bulbs and a handy remote that also works with HA.
I don’t know how tech savvy you are, but I’m assuming that since you’re on Lemmy it’s pretty good :)
The way we’ve solved this sort of problem in the office is by using the LLM’s JSON response, and a prompt that essentially keeps a set of JSON objects alongside the actual chat response.

In the D&D example, this would be a set of character sheets that gets returned with every response but only changed when the narrative changes them. It’s more expensive and needs a larger context window, but it’s reasonably effective.
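The state-carrying pattern described above can be sketched roughly like this. The `call_llm` function below is a stub standing in for a real chat-completion API call, and the JSON envelope (`reply` plus `character_sheets`) is an illustrative assumption, not a specific provider's format:

```python
import json

# Prompt that pins the model to a JSON envelope: free-form narration in
# "reply", structured game state echoed back in "character_sheets".
SYSTEM_PROMPT = (
    'Reply ONLY with JSON of the form '
    '{"reply": "<narration>", "character_sheets": {...}}. '
    'Echo character_sheets back unchanged unless the narrative changes them.'
)

def call_llm(system: str, history: list[dict]) -> str:
    # Stub in place of a real LLM call; returns a canned response
    # in the expected envelope so the flow can be demonstrated.
    return json.dumps({
        "reply": "The goblin's blade finds its mark. Thorin takes 3 damage.",
        "character_sheets": {"Thorin": {"hp": 9, "ac": 16}},
    })

def take_turn(history: list[dict], sheets: dict, user_msg: str):
    # Send the current sheets alongside every user turn, then treat the
    # sheets the model returns as the new authoritative state.
    history = history + [
        {"role": "user", "content": user_msg},
        {"role": "user", "content": "STATE: " + json.dumps(sheets)},
    ]
    data = json.loads(call_llm(SYSTEM_PROMPT, history))
    return history, data["character_sheets"], data["reply"]

history, sheets, reply = take_turn(
    [], {"Thorin": {"hp": 12, "ac": 16}}, "I charge the goblin."
)
```

Resending the whole sheet each turn is what drives the extra cost and context usage mentioned above, but it keeps the state in one place instead of scattered through the transcript.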