We are diving into some additions for the rules. In case you are missing an obvious feature, please don’t hesitate to share. This is the place ;-). Some thoughts:
When-Then
No attribute update as a condition
Comparing attribute values as a condition
Flow
Sum, max, min, avg, med processors
An attribute history input to compare values or changes
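To illustrate the aggregate-processor idea, here is a minimal sketch of what such processors might compute over a window of attribute values (the names and the `apply_processor` helper are hypothetical, not OpenRemote API):

```python
from statistics import median

# Hypothetical aggregate processors: each maps a window of numeric
# datapoints to a single output value.
PROCESSORS = {
    "sum": sum,
    "max": max,
    "min": min,
    "avg": lambda values: sum(values) / len(values),
    "med": median,
}

def apply_processor(name, values):
    """Apply the named aggregate processor to a list of datapoints."""
    if not values:
        raise ValueError("no datapoints in window")
    return PROCESSORS[name](values)
```

In a flow, each of these would simply be a node with a multi-value input and a single-value output.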
Integrate AI LLM functionality. I noticed on Paul Hibbert’s YouTube channel that this is already feasible with HomeAssistant. The results are both astonishing and amusing. This functionality could be effectively utilized in conjunction with e.g. smart city lighting projects.
Good suggestion. We are considering gradient-based modeling for forecasting, as it can potentially apply to many cases where the underlying physical model is known but its parameters are not. An example: estimating the energy consumption of a building. It relates to occupancy (time of day/day of week, remember the office project @michal) as well as outside temperature. Basically more advanced than multi-regression, as it can handle non-linear as well as interdependent parameters. Any thoughts about the most relevant use cases and specific models to use?
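As a toy sketch of the idea: assume a (made-up) energy model where consumption depends linearly on occupancy and non-linearly on outside temperature, and fit the unknown parameters by plain gradient descent. The model form and all numbers are illustrative assumptions, not the actual approach under discussion:

```python
# Assumed model: energy = a * occupancy + b * heating_demand(t_out),
# where heating_demand is a non-linear function of outside temperature.
def heating_demand(t_out):
    return max(0.0, 18.0 - t_out)

def fit(samples, steps=5000, lr=0.001):
    """samples: list of (occupancy, t_out, energy). Returns fitted (a, b)."""
    a, b = 0.0, 0.0
    n = len(samples)
    for _ in range(steps):
        grad_a = grad_b = 0.0
        for occ, t_out, energy in samples:
            err = a * occ + b * heating_demand(t_out) - energy
            # Mean-squared-error gradients w.r.t. a and b.
            grad_a += 2 * err * occ / n
            grad_b += 2 * err * heating_demand(t_out) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b
```

The same loop generalizes to interdependent or non-linear parameters, which is exactly where plain multi-regression falls short.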
You can easily run all open-source generative AI models on your laptop, even offline, and explore various scenarios. Watch this video for more details: https://youtu.be/29dwMiBiFio. Models are easily selectable there, so the same should be possible in OpenRemote. Let the community try them all!
The best idea for integrating LLMs into OpenRemote would be to create some OR Agents that provide integration with various AI model providers, like OpenAI and Anthropic, or self-hosted/OSS options like GPT4all, OpenWebUI, etc.
But I’m not sure what exactly those LLMs would offer to a very data-oriented platform like OpenRemote. What exactly are you thinking of? If OpenRemote is a provider of data, rather than a consumer of LLM-based APIs, then I’d understand that. Maybe when it matures, it would make sense to integrate Anthropic’s MCP, so that OpenRemote could serve its API and datapoints as an MCP server. That would allow AI models to query OpenRemote for data, and that would be the AI integration.
Another thing that came to mind is extending the read-attribute-history processor with an optional input to set the specific time of the historic datapoint. Example usage: get the energy used from the 2nd of June up to now.
However, this might be more at home as a custom dashboard widget:
Widget display:
- Select two moments in time.
- Show the result.

Widget settings: select attribute(s), and select whether the widget should display the difference, average, or sum between the two moments in time.
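The widget logic could be sketched roughly like this, assuming the attribute history arrives as timestamped datapoints (the `between` helper and its signature are hypothetical):

```python
from datetime import datetime

def between(history, start, end, mode):
    """Compute a statistic over an attribute history between two moments.

    history: list of (timestamp, value) tuples, sorted by timestamp.
    mode: "difference", "average" or "sum".
    """
    window = [v for ts, v in history if start <= ts <= end]
    if not window:
        raise ValueError("no datapoints in the selected range")
    if mode == "difference":
        return window[-1] - window[0]  # last minus first datapoint
    if mode == "average":
        return sum(window) / len(window)
    if mode == "sum":
        return sum(window)
    raise ValueError(f"unknown mode: {mode}")
```

For a cumulative meter reading, "difference" directly gives the energy used between the two selected moments.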
Regarding AI, I also think that integrating LLMs for rule decisions within OR will in many cases only result in a very energy-intensive solution, while the end result will not necessarily perform better than mathematical / rule-based approaches when the problem is unambiguous.
Specialized neural networks for forecasting however…
I’ve been playing with the flow editor in my head a little bit, and thought about my times with Simulink back in college.
Things that came to mind are:
Submodule creation: Wrap a part of the flow diagram into a function, effectively creating a new processor, which can be exported in JSON format and reused elsewhere. This would require function input and output processors. https://www.mathworks.com/help/simulink/slref/simulinkfunction.html
– Potential use cases: often-repeated simple flows like unit conversion, but also whole physical model representations (where integrators/differentiators are also useful).
Simulation: With loaded forecast/future datapoints, you could use these datapoints to simulate the expected behaviour of your solution when you have a physical representation created within the flow editor (a digital twin, for the buzzword fans). This would mean you define a timeframe and timestep for which the rule should run, and calculate future datapoints for the other attributes at each timestep. A practical use case would be calculating the optimal size of some physical component that is still missing from your system and requires a large investment. These can be batteries, heating buffers, heater sizes, you name it.
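The submodule idea could be sketched as composing primitive processors into a named, JSON-exportable unit (all names here are hypothetical illustrations, not OpenRemote API):

```python
import json

# Hypothetical primitive processors that a submodule could chain together.
PRIMITIVES = {
    "multiply": lambda x, k: x * k,
    "add": lambda x, k: x + k,
}

def make_submodule(name, steps):
    """steps: list of (primitive_name, constant) applied in order.

    Returns the callable submodule plus its JSON definition for reuse.
    """
    def run(value):
        for prim, k in steps:
            value = PRIMITIVES[prim](value, k)
        return value
    definition = json.dumps({"name": name, "steps": steps})
    return run, definition

# Example: Celsius-to-Fahrenheit as a reusable unit-conversion submodule.
c_to_f, exported = make_submodule("c_to_f", [("multiply", 1.8), ("add", 32)])
```

The JSON definition is what would be imported elsewhere to recreate the same processor chain.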
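The simulation/sizing idea could be sketched as a toy loop over forecast datapoints, here for battery sizing (the function and all numbers are made up for illustration):

```python
def simulate_battery(capacity_kwh, net_load_kw, timestep_h=1.0):
    """Step through a net-load forecast and return total unmet demand (kWh).

    net_load_kw: forecast of load minus generation per timestep.
    Positive values discharge the battery; negative values charge it.
    """
    soc = capacity_kwh  # state of charge, start fully charged
    unmet = 0.0
    for load in net_load_kw:
        energy = load * timestep_h
        if energy > 0:  # discharging
            drawn = min(energy, soc)
            unmet += energy - drawn
            soc -= drawn
        else:  # charging, capped at capacity
            soc = min(capacity_kwh, soc - energy)
    return unmet
```

Sweeping `capacity_kwh` over candidate sizes then gives the smallest battery that keeps unmet demand at zero over the forecast period, before committing to the investment.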
It opens new possibilities, which are not necessarily better, but different. OR can start thinking ‘outside the box’. And, of course, don’t use it to calculate 2+2.
I think you are right that it can open up whole new possibilities. But my point was that the rules section does not seem the logical place for implementation.
You could just feed the LLM the API schema and let it control everything from there?