Supporting innovation in the built asset industry
Research Overview
At its core, the University of Toronto Center for Intelligent Buildings Digital Twinning (IBDT) aspires to bring about a transformative shift in how artificial intelligence is used in the built environment.
Overall, the IBDT intends to guide the processes of design, construction, and operational planning of buildings (at UofT and beyond) by understanding the data generated by the building (i.e., the what); analyzing the impact of operational parameters on performance (i.e., the why); using machine learning to predict future conditions (i.e., what's next); simulating the impacts of policy changes and operational scenarios on performance (i.e., the how); and doing all of this in the virtual world before taking action in the real world.
In addition to conducting research, the IBDT is creating a data portal for UofT facility data, which will enable facility users and other researchers at UofT to access and use data about each building, its performance, and its operational characteristics.
Intelligent Buildings and Digital Twins
Before exploring how the IBDT intends to achieve its goal of virtualizing built assets, it is important to establish a common definition for two fundamental concepts: intelligent buildings and digital twins.
In simple terms, a "connected" building is one equipped with sensors and IoT systems for data collection, while a "smart" building uses analytics of the collected data to control building operations. An intelligent building goes further: it employs data analytics to profile its users' activities and become aware of its own conditions. An intelligent building aims to interactively adapt to building users' needs and provide them with suggested actions that promote both wellbeing and productivity, all while enhancing the overall sustainability of the building.
Definitions of the digital twin vary between researchers and practitioners. At IBDT, we adopt a definition that describes a digital twin as a digital replica of the actual conditions and possible states of the three key facets of a facility: physical, (decision) processes, and people. A digital twin can be understood as a tool that combines analytics, process modeling, and building-user interaction to imagine alternative decision-making protocols; examine scenarios for optimizing energy and user wellbeing; and, most importantly, engage building users in a way that empowers them to lead the innovation of new operational schemes (i.e., by providing insights obtained from data analytics and artificial intelligence).
Why Do We Need Digital Twinning?
Advancing the built asset industry in Canada has extensive potential. To begin, the stakes are high. According to NRCan [1], Canada has over 400,000 institutional buildings, accounting for 18% of emissions. Federal Government investments in rehabilitation alone are estimated at $3 billion.
The efficiency of current asset management tools is lagging. As a case in point, between 2012 and 2018, despite substantial investments in asset management tools for water systems, watermain break rates in North America increased by about 27% [2]. The social costs of these events have been found to reach as high as 400% of construction costs on certain projects [3].
Finally, the rewards for more efficient decision making in the built asset industry are significant. Proactive maintenance is 12-18% cheaper than reactive maintenance [4], and the financial return on investment of timely maintenance decisions is estimated to be 10 times that of delayed, reactive maintenance [4].
The built asset sector's pathway to enhanced productivity, operational efficiency, and decarbonization is complicated by systemic problems.
- Complex decision making: owners (especially of residential and small commercial units), most contractors, and many consultants lack the knowledge needed to decarbonize their facilities, make sound decisions, and manage projects effectively.
  - The sociotechnical nature of decision making: decisions rest not only on technical or even financial considerations, but on a complex set of social, regulatory, and economic criteria.
  - Context-dependency: stakeholders consider and evaluate a multitude of options, which depend heavily on the setting, the nature of the facility, and the profile of the owners.
- Outdated supply chain:
  - Fragmented industry structures: to design and rehabilitate a facility, an owner must navigate an industry dominated by short-term, project-based thinking.
  - Risk-transfer contracting: the industry has long been dominated by a bidding system built on the contentious transfer of risk between parties, which promotes mistrust and punishes innovation.
  - Ad hoc processes: even when a technology solution is clear for a specific context, a set of ill-defined processes must be managed to realize the decarbonization solution, ranging from outdated permitting and complicated contracting to a construction process laden with schedule and budget overruns.
- Digitization backlog: digitization can clearly help address the decarbonization challenge, but it faces several key obstacles:
  - Level of usage: the construction industry has some of the lowest levels of information technology utilization, resulting in inefficient performance. In fact, it is the only economic sector to have lost productivity over the last 40 years, while every other sector gained.
  - Orientation: most IT systems in the construction industry target engineers and architects and focus on ever-deeper technical analyses. Very little work has been directed at other stakeholders, particularly owners and facility users.
  - Technical vs. business tools: limited work has been done on the business side of the industry (e.g., managing supply chains), and there is little awareness of the socio-economic nature of the problem (the need to profile users and address their expectations).
  - The dominance of top-down thinking: the domain, particularly BIM, has been dominated by a belief in the role of experts in setting data models and developing rules, and by an orientation toward structured data only (treating the structuring of unstructured data as the sole way to handle text and other unstructured data). This runs against the growing recognition of unstructured data as a key source for knowledge capture, and against the ad hoc, subjective nature of construction data.
  - Outdated data sharing and governance systems: resistance to sharing data is common in the domain, possibly because of the perception that sharing reduces competitiveness. Similar concerns arose in other domains, where rational solutions showed that data sharing (and even coopetition) benefits both individual businesses and the whole sector in the long run. The limited utility of IT systems to non-technical teams, and the neglect of the sector's business and socio-economic aspects, have not helped either.
The Opportunity with the University of Toronto’s Digital Twin
The following features make our dataset unique:
- Data triangulation: longitudinal structured data (e.g., IoT data), unstructured data (e.g., complaints and maintenance logs), and building contextual data (e.g., weather and community profiles).
- Data reliability: the engagement of facility operators will provide researchers with a "ground truth" view of the data, helping to overcome data reliability and completeness issues.
- Higher-value data: providing access to the data for researchers working on other topics, such as energy and air quality, will generate new sets of insightful data (e.g., new simulations of possible scenarios).
- Occupants and their data: informatics research in the domain has long focused on professionals. The IBDT will have access to the (typically elusive) occupant data, and to the occupants themselves: innovative stakeholders with unique knowledge profiles who can create ideas and identify new applications.
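As a minimal sketch of what data triangulation can look like in practice, the snippet below joins structured IoT readings, contextual weather data, and unstructured complaint text into a single daily record. All field names, values, and the keyword-based complaint flag are illustrative assumptions, not the IBDT portal's actual schema.

```python
# Hypothetical sketch of "data triangulation": joining structured IoT
# readings with contextual weather data and unstructured complaint text.
# Field names, values, and thresholds are illustrative only.
from datetime import date

iot_readings = {           # structured longitudinal data (zone temperature, degrees C)
    date(2023, 1, 9): 26.5,
    date(2023, 1, 10): 21.0,
}
weather = {                # building contextual data (outdoor temperature, degrees C)
    date(2023, 1, 9): -4.0,
    date(2023, 1, 10): -2.0,
}
complaints = [             # unstructured data (free-text occupant complaints)
    (date(2023, 1, 9), "room 210 is far too hot this morning"),
]

def triangulate(day):
    """Combine the three sources into one record for a given day."""
    texts = [t for d, t in complaints if d == day]
    return {
        "day": day,
        "zone_temp": iot_readings.get(day),
        "outdoor_temp": weather.get(day),
        # crude keyword flag standing in for real text analytics
        "complaint_flag": any("hot" in t or "cold" in t for t in texts),
    }

record = triangulate(date(2023, 1, 9))
```

A record like this, linking a high zone temperature, cold outdoor weather, and a comfort complaint, is exactly the kind of cross-source signal that no single dataset exposes on its own.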
Models in the Digital Twin
A digital twin uses IoT data, functional data, management data, and user input. In addition, digital twins use several "technical" models as input, such as a BIM (Building Information Model), predictive models for hardware failure (increasingly provided by BAS), or an energy simulation model (e.g., developed by the designer). A digital twin can help generate four types of "virtualization" models of the building.
- Descriptive models: these models use facility data to understand the ongoing conditions of a facility. For example, a DT can collate BIM and facility data to generate an energy model for building utilities or calculate its carbon footprint.
- Predictive models: these models use artificial intelligence tools to predict what will, or can, happen under the existing conditions of a building/asset. Examples include cost trends and expected budget overruns, productivity patterns, and the detection of safety-sensitive tasks and situations.
- Prescriptive models: prescriptive models are where a DT shifts from visualization to virtualization. These models experiment with potential scenarios in the virtual world before implementing them in the real world, answering questions such as "what will happen if" and "what can be done". For example, replicating maintenance policies under new assumptions, a revised technical architecture, or the addition of new equipment.
- Generative models: these models engage stakeholders in co-imagining futures. Options can also be generated by computers, revealing solutions that are unseen, unimaginable, or more efficient. Thanks to the increasing adoption of process-aware information systems, a Generative Design Workflow (GDW) can generate workflows that examine new decision criteria or actor roles and support adaptive tasking. Instead of using a DT to provide reactive information on facility management issues, a GDW can generate/imagine new workflows that integrate business intelligence into decision making.
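As a rough illustration of how the descriptive, predictive, and prescriptive layers build on each other, the sketch below runs the chain on hypothetical monthly energy data. All figures, the linear trend, and the 15% retrofit assumption are illustrative stand-ins, not IBDT models or results.

```python
# Minimal sketch of the descriptive -> predictive -> prescriptive chain
# on hypothetical monthly energy data (kWh). All numbers are illustrative.

energy = [120, 118, 125, 130, 128, 135]  # observed monthly consumption

# Descriptive: summarize the ongoing condition of the facility
n = len(energy)
mean_use = sum(energy) / n

# Predictive: fit a simple least-squares trend and forecast next month
xs = range(n)
x_bar = sum(xs) / n
y_bar = mean_use
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, energy))
         / sum((x - x_bar) ** 2 for x in xs))
forecast = y_bar + slope * (n - x_bar)  # projected consumption for month n

# Prescriptive: test a "what will happen if" scenario in the virtual world,
# here a hypothetical retrofit assumed to cut consumption by 15%
retrofit_forecast = forecast * 0.85
```

Real DT models would of course use richer features (weather, occupancy, BAS signals) and proper machine learning, but the shape of the reasoning, describe, then predict, then experiment virtually before acting, is the same.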
Research Areas
To carry out the much-needed rehabilitation of our buildings, we must generate new knowledge and re-develop decision systems in the sector. The sector relies on rules of thumb (the famous 2-4% annual investment target) or standard formulae, such as the omnibus deterioration curve. With these outdated tools, a decision-maker must weigh a complex set of goals: level of service (LOS), energy savings, and climate impacts. In fact, over the last four decades every industry has achieved productivity gains except construction.
We need new tools. We also need to replace the existing expert-based mentality with one of learning from real-world practice. By virtualizing futures, digital twinning supports decision-making for choosing the "right" project, allowing decision-makers to study the impact of a project on tangible (i.e., built assets) or intangible (i.e., environmental, social) systems over its entire life cycle. Digital twinning also helps build a project the right way. By pooling the expertise of all stakeholders and drawing on insights from machine learning, we can improve safety and productivity while reducing delays and costs.
The unprecedented dataset of about 150 buildings at UofT allows us to explore approaches that rely on data analytics and machine learning. This can lead to a breakthrough in informatics research in the domain, which has been dominated by normative thinking.
Empowerment
IBDT is built on a belief in knowledge co-creation and occupant empowerment. Instead of tokenistic consultation with citizens (or merely handing them the right to choose between options), co-creation equips the Net-savvy generation with analytics tools, engagement activities, and reengineered work processes to co-generate knowledge and invent new solutions, ones that address issues of interest to them (e.g., climate change).
We want to put users, their socio-economic needs, and their business requirements at the forefront. Imagine if the owner of a small strip mall or a single-family house could plan, re-design, and decide on decarbonizing their facility by uploading a file to an interactive portal. Imagine if all that were required was a few short sentences describing how they would like their facility to be remade, and generative computing bots could transform the text into a digital layout of the facility. Imagine if we collected data about recent rehabilitation projects and used data analytics to "learn" the best means of rehabilitating every type of building. Imagine if we could predict the costs of such rehabilitation.
Data Analytics and Machine Learning
The use of AI is not just about the impressive crunching of numbers. It is about letting real-world data inform us of patterns and new discoveries without the limitations of supervisory experts telling the computer what is right. It is about being agnostic to data types and structure. It is about letting people access and use sophisticated tools using simple input and friendly interfaces while keeping the sophisticated analytics and number crunching hidden. Ultimately, it is about empowering people to focus on sharing ideas, learning new knowledge, exploring different ideas, focusing on outcomes not just means, and co-generating (different) futures.
The IBDT will conduct research on the use of machine learning to advance data-driven culture in the built environment, with particular emphasis on the use of analytics to generate business intelligence tools for enhancing project commissioning, advancing sustainable practices, and exploring new horizons in human-building interactions. The agenda spans four modes:
- Deployment: use BIM-based software to establish reliable means for data collection, sharing, and access.
- Analytics: integrate IoT data, project documents, and stakeholders' input to create soft-sensing mechanisms in which algorithms detect new facts or events; use predictive analytics to support proactive decision making; and use prescriptive analytics to assess potential future work/operations scenarios.
- Experimental: act as a testbed for advanced practice in process automation and streamlined information flows, delivering the right data to the right person at the right time.
- Explorative: investigate the potential of inductive programming and cognitive computing to advance virtualization, the concept of interactive buildings, and the algorithmic governance of facilities.
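To make the idea of a soft-sensing mechanism concrete, the sketch below infers an event (a window likely left open) from signals the building already collects, rather than from a dedicated sensor. The rule, thresholds, and signal names are hypothetical illustrations, not an IBDT algorithm; in practice such detectors would be learned from data rather than hand-coded.

```python
# Hedged sketch of a "soft sensor": inferring an event from existing
# signals instead of installing a dedicated sensor. Thresholds and
# signal names are assumptions for illustration only.

def soft_sense_open_window(zone_temp_drop, co2_drop, hvac_heating_on):
    """Flag a probable open window when indoor temperature (degrees C) and
    CO2 (ppm) fall together while heating is running, a combined pattern
    that no single sensor reports on its own."""
    return zone_temp_drop > 2.0 and co2_drop > 150 and hvac_heating_on

# Example: a sharp simultaneous drop in both signals during heating
alert = soft_sense_open_window(zone_temp_drop=3.1, co2_drop=220,
                               hvac_heating_on=True)
```

The same pattern, fusing several routine data streams to detect a fact none of them states directly, generalizes to occupancy estimation, equipment fault detection, and other soft-sensing targets.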
Data Governance & Policymaking
Data governance is not merely data access management. In addition to access, governance encompasses the policies and tools for advancing the role of data in organizational decision making; the rational collection of data; reliable quality assurance; the promotion of transparency; and empowering users not only to access and understand the data, but also to use it to generate insights.
In general, and specifically in the built asset sector, there has been hesitation about sharing data. In many cases this stems from uninformed traditions; it also reflects the perception that data sharing compromises competitiveness, security, or privacy. By progressively sharing its data and supporting research into data governance in the domain, UofT establishes a model that recognizes that i) transparency and data sharing have value not just to industry but to whole communities; and ii) there are means to be transparent while addressing concerns about security and privacy.
IBDT will demonstrate the use of best practices in data governance; investigate policies for the collection and audit of data; examine the applicability of open data systems, including generating security/access rights best practices; and participate in realizing the data governance vision for UofT.