Where Should My Building’s AI Go?

Building managers aren’t always at ease introducing cutting-edge automation systems to the facilities for which they are responsible. But a confluence of computing advances, pandemic-driven mandates and evolving occupant expectations—especially around public health and the workplace experience—is pushing the profession to embrace new technologies faster than it ever has before.
While building managers recognize the opportunity to achieve unprecedented efficiencies, the technological decisions involved are often outside their wheelhouse. Their expertise is commonly found in HVAC, lighting controls and chillers—not IoT, networking, and . . . gulp . . . artificial intelligence.
AI is a particularly game-changing technology that can also be especially intimidating due to its complexity and the lack of visibility into how it makes decisions. Often the first challenge when adopting AI for building automation is to answer a fundamental question: where does it go?
The AI now being deployed to make buildings more energy efficient, healthy, autonomous, secure and responsive to occupant needs started life as a cloud computing technology. The machine learning algorithms under the hood of these systems require significant computing power, both to train the algorithms and to call on them to deliver insights, a process called inferencing. Until recently, on-premises infrastructure rarely had the resources to do those things effectively.
But running smart building applications out of remote data centers had its own limitations. Connectivity, bandwidth costs, security and latency (the time it takes to send data to the cloud and back) could impact the system's efficacy. If a machine, or an entire building automation system, is going to fail, the alarm and automated response need to be as immediate as possible.
That issue has largely been mitigated by a new generation of edge computing technology: infrastructure installed in facilities with the processing power demanded by these compute-intensive workloads.
Companies like FogHorn, founded seven years ago, have developed an Edge AI technology that creates new possibilities to digitally transform building operations. Johnson Controls acquired FogHorn at the start of 2022 and has now integrated the edge technology into its OpenBlue building automation platform.
By closing the on-premises capability gap, these edge devices provide an architectural component important for achieving the goal of running a building as efficiently and effectively as possible. With their availability, building managers thinking about implementing smart automation technology now almost inevitably confront the question of whether to deploy AI on-premises or in the cloud. For those facing this question, there are some simple rules of thumb to consider.
As we already mentioned, actions that need to be executed in real time, or close to it, are common edge use cases. Smart automation systems that detect operational problems and automatically alert or respond to them tend to work best when latency is minimized.
Anytime you want local control of a system, it’s also best to do it on the edge; turning off a machine or adjusting a control system from the cloud often runs into security and latency challenges.
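To make that pattern concrete, here is a minimal sketch of an edge-side rule. The threshold, the `on_reading` handler and the `shut_off` callback are all illustrative names, not part of any particular vendor's API; the point is simply that detection and response happen on the local device, with no round trip to the cloud.

```python
# Illustrative edge-side rule: detect a fault locally and respond at once.
# The threshold and function names are assumptions for illustration only.

ALARM_TEMP_C = 8.0  # assumed chilled-water supply temperature limit

def on_reading(temp_c, shut_off):
    """Called on the edge device for each incoming sensor reading."""
    if temp_c > ALARM_TEMP_C:
        shut_off()  # immediate local response, no cloud round trip
        return "alarm"
    return "ok"

# A reading above the limit triggers the local shut-off immediately.
on_reading(9.1, shut_off=lambda: print("chiller stopped"))
```

Because the rule evaluates on-premises, it keeps working even if the building's internet connection drops.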
Then there are data transit and storage costs to consider. Take, for example, a video monitoring system in which high-fidelity images from multiple cameras are analyzed by a computer vision model, a popular AI application. Sending all of that data to the cloud and storing it there can quickly become cost-prohibitive.
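A rough back-of-envelope calculation shows why. The camera count, stream bitrate and per-gigabyte transfer price below are illustrative assumptions, not vendor figures; the order of magnitude is the point.

```python
# Back-of-envelope estimate of monthly cloud transit for a camera system.
# All figures are illustrative assumptions, not real vendor pricing.

CAMERAS = 16
MBPS_PER_CAMERA = 4           # assumed 1080p H.264 stream bitrate
SECONDS_PER_MONTH = 30 * 24 * 3600

# Total data shipped to the cloud per month, in gigabytes
gb_per_month = CAMERAS * MBPS_PER_CAMERA * SECONDS_PER_MONTH / 8 / 1000

EGRESS_USD_PER_GB = 0.09      # assumed transfer price
print(f"{gb_per_month:,.0f} GB/month -> ${gb_per_month * EGRESS_USD_PER_GB:,.0f}/month in transit alone")
```

Under these assumptions the system ships over 20 TB a month before a single byte is stored, which is why analyzing the video at the edge and forwarding only events or summaries is often the economical choice.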
Other use cases aren't as clear cut. Often building managers want a deeper understanding of how their facilities are operating based on AI analytics, or to run simulation exercises on a 'digital twin' version of their facilities. That kind of rigorous data analysis typically doesn't need to happen in real time, so it's best executed in the cloud, where the customer can harness at any scale the most powerful hardware and software tools for the job.
Running AI on the edge might also not be the best choice if you’re responsible for running multiple buildings and need to correlate information between them. In that case, the cloud allows for a centralized data clearinghouse and command center. As a practical matter, a hybrid approach is typically employed where some initial processing in the individual buildings happens through Edge AI, and then cloud AI is run on the aggregated data from multiple buildings, possibly combining other data sources.
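That hybrid pattern can be sketched in a few lines. The function names and the kWh figures below are purely illustrative; the idea is that each building's edge node reduces raw readings to a compact summary, and the cloud layer correlates only those summaries.

```python
# Hedged sketch of the hybrid pattern: edge nodes summarize locally,
# the cloud correlates across buildings. All names are illustrative.

def edge_summarize(readings):
    """Runs on-premises: reduce raw sensor readings to a small summary."""
    return {
        "min": min(readings),
        "max": max(readings),
        "avg": sum(readings) / len(readings),
    }

def cloud_correlate(summaries_by_building):
    """Runs in the cloud: compare buildings using only the edge summaries."""
    return max(summaries_by_building, key=lambda b: summaries_by_building[b]["avg"])

# Example: energy readings (kWh) from two buildings
summaries = {
    "hq": edge_summarize([120, 135, 128]),
    "lab": edge_summarize([210, 190, 205]),
}
print(cloud_correlate(summaries))  # the building with the highest average load
```

Only the small summaries cross the network, which keeps transit costs down while still giving the central command center a portfolio-wide view.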
It’s important to remember that these are decisions building managers don’t need to make alone—your technology vendor should work with you to ensure AI is deployed where it will serve your unique needs best. And building managers certainly don’t need to be exposed to the complexity of AI and its underlying machine learning algorithms; they can simply let it work its magic behind the scenes.
Oracle, like many organizations now initiating return-to-work policies at a large scale, sees the aftermath of the pandemic as a unique moment in which to introduce smart-building systems. After a couple of years of pandemic-induced closures, employees insist on a physical workplace where amenities are at their fingertips, collaboration tools are ubiquitous, air quality is monitored, crowding is limited, and their companies are meeting sustainability goals in their use of energy and water and reduction in waste. And with buildings still at historically low occupancy rates, powering off systems that don’t need to be running helps deliver considerable gains in efficiency.
These shifting workplace dynamics and expectations can be an opportunity to assess new investments in Internet of Things (IoT) technologies, the advanced networks that connect them and the AI systems that control them—decisions made based on occupancy, employee experience needs, ownership of the sites and their criticality (e.g., a research lab compared to office space).

Unlike in the past, building managers are prioritizing utilization metrics over schedules as the key consideration by which they invest in automated control systems. They can’t take for granted that everybody will come back: many companies are adopting hybrid work policies, and, for the first time, the office needs to compete against the home as an attractive and productive work environment.
Seasoned building managers are scrambling to learn the new skills required for these modern operations. They know that with AI on their side, running either on the edge or in the cloud, they might just have a leg up in encouraging employees back to the office by offering a safe, sustainable environment in which to meet face-to-face with their colleagues and customers, gather around real water coolers, and have far fewer cats and kids making guest appearances at meetings.

About The Author

Francisco Ruiz is the global director of IoT and an infrastructure strategist for Real Estate and Facilities at Oracle. He leads a global team responsible for the strategy, design, security and implementation of enterprise IoT solutions across an evolving 22 million-square-foot real estate portfolio. Recognized as a thought leader and subject matter expert in digital transformation, he is a frequent speaker on IoT-related topics. Over the last two decades he has advised various real estate teams along their digital transformation journeys, optimizing operations and portfolio utilization and developing smart building strategies for the workplace of the future. He has also contributed to various industry firsts, holds patents for continuous commissioning, and is a board member, advisor and participant in thought-leading CRE tech organizations including Realcomm and CoreNet.

Sastry Malladi is vice president of the OpenBlue Solutions AI organization, responsible for edge strategy, development and AI capabilities. He joined JCI through the FogHorn acquisition, where he was co-founder and CTO. At FogHorn he built the engineering, product and technology organization from scratch, was responsible for product strategy, development and customer engagement, and shared many corporate responsibilities with the CEO. A results-driven technology executive with more than three decades of technology and management experience, his areas of expertise include developing, leading and architecting highly scalable distributed systems in AI, edge computing, industrial IoT, cloud computing, big data, SOA and microservices architecture, to name a few.
