
3.5 Years of Research in 7 Minutes: The Usability Gap
I’ve spent the last few years trying to diagnose a single, stubborn problem: why, despite all our standards and technology, do we still struggle to hand over buildings that actually work for the people who run and use them?
I have now synthesised 3.5 years of “grounded theory”-informed research, including hundreds of conversations and the collective wisdom of those I have engaged with, into a final Discussion Paper: The Usability Gap.
The Core Thesis: The industry is stuck because we confuse Information Management (IM) with Human-Centred Design (HCD):
– IM focuses on compliance (is the data there?)
– HCD focuses on usability (can the human find it in 3 minutes?)
The paper argues we must stop treating data as a “compliance exercise” and start treating it as a “Testable Product.”
It outlines a framework for a Minimum Data Handover Requirement (MDHR), complete with pilot test scripts for Maintenance and Energy. This builds on “Unifying the built environment: a framework for operational value,” my recent article with Pete Swanson of Mott MacDonald for Digital Construction Plus.

The Video Summary:
I have embedded a 7-minute AI-generated video summary below that walks through the core arguments, the “Usability Gap” diagnosis, and the proposed solution.
The Audio Summary:
If you prefer to listen, there is a 12-minute AI-generated podcast summary available below:
Acknowledgements & Context:
Producing this has been a labour of love, and I have taken this research as far as I can without formal funding.
The list of those who have fed into this and engaged with me is far too long to thank individually, but you can see a thin slice of that engagement on this blog, where I have shared this work to (among other things) sense-check my findings.
But I just wanted to say a huge thank you to everyone in the Start With Smart group I set up a couple of months ago for sense-checking my findings, and to those I have engaged with over the last few years for your support and intellectual rigour. This paper belongs to the community as much as it does to me.
The Full Paper
I am sharing the full paper below as a possible toolkit for the industry to use, critique, and hopefully adopt.
Discussion Paper: The Usability Gap
Closing the Divide between Project Delivery and Operational Reality
Executive Summary: A Toolkit for Closing the Usability Gap
The built environment faces a systemic challenge: the “Project-Operations Gap.” Despite the proliferation of rigorous technical standards, the industry continues to deliver data that is technically compliant yet operationally unusable. This Discussion Paper argues that the solution lies in a fundamental paradigm shift: we must stop confusing Information Management (IM) with Human-Centred Design (HCD). We must move from delivering abstract data assets to delivering testable information products.
This document is structured as a complete toolkit to help industry leaders understand, diagnose, and solve this problem:
The History (Part 1): Traces the origin of this insight through 3.5 years of “Grounded Theory” and industry observation. It identifies the cultural friction between the “Idealist” (Project Compliance) and “Pragmatist” (Operational Reality) mindsets that created this gap.
The Theory (Part 2 & Appendix A): Diagnoses the root cause. It articulates the critical distinction between IM and HCD and provides a detailed theoretical analysis of why current standards (such as FMS 002) fail to deliver usability without a user-centric bridge.
The Practice (Part 3 & Appendix B): Moves from abstraction to action. This section proposes the Minimum Data Handover Requirement (MDHR) as a solution. It includes specific, ready-to-use Pilot Test Scripts for Maintenance and Energy scenarios, allowing project teams to validate data usability before handover.
Part 1: The Origin Story – An Outsider’s Observation
The Context: 3.5 Years of Grounded Theory
I come to this problem not as a traditional engineer, but with a background in Human-Centred Design (HCD), underpinned by an MSc from a School of Computing, Engineering & Mathematics and a career of over three decades spent in digital transformation across other sectors.
For the past three and a half years—effectively the duration of a PhD—I have been conducting a form of “Grounded Theory” informed research on the Built Environment. I have interviewed hundreds of experts, co-written articles with and for industry leaders, curated panels at major expos (London Build Expo, Digital Construction Week, Smart Buildings Show, PropTech Connect, Realcomm and WorkTech), and facilitated high-level roundtables.
My goal was simple: to understand why, despite incredible technology and endless standards, the industry cannot seem to deliver buildings that work seamlessly for the people who operate them.
The Catalyst: Burnout and Clarity
In July 2024, I resigned as Executive Director of the Digital Buildings Council, which I helped launch, due to burnout. Stepping back gave me the distance I needed to see the pattern.
I realised the industry is trapped in a methodology clash. It is attempting to solve dynamic, operational problems using static, waterfall construction processes. It is trying to build “Smart” buildings without the “Lean Startup” or “Product” thinking required to make them usable.
To test this hypothesis, I founded the “Start With Smart” discussion group. What began as a small WhatsApp chat has exploded into a community of over 160 senior leaders from across the ecosystem—Contractors (Sir Robert McAlpine, Laing O’Rourke, Skanska, Multiplex), Consultants (Mott MacDonald, WSP, NDY), Tech Giants (Honeywell, Siemens, Johnson Controls, Hewlett-Packard), and major Operators (JLL, Sodexo, CBRE).
The “Inside-Out” Hunch
Through facilitating this group, I observed a fundamental tension. I call it the Idealist vs. The Pragmatist:
- The Idealist (The Project Mindset): Focuses on compliance, standards (ISO 19650), and comprehensive data schemas (COBie, IFC). Their goal is a perfect asset.
- The Pragmatist (The Operational Mindset): Focuses on keeping the lights on, answering complaints, and reducing energy bills today. Their goal is a usable tool.
The Gap: The industry assumes that if you satisfy the Idealist (Information Management), you automatically satisfy the Pragmatist (Operations).
My Conclusion: This is false. Information Management (IM) is not Human-Centred Design (HCD).
IM ensures data is compliant and stored. HCD ensures data is findable, understandable, and actionable by a human under pressure.
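The distinction can be made concrete with a minimal sketch. The two records below both pass an IM-style schema check, but only one passes an HCD-style usability check. The field names and check logic are illustrative assumptions, not a real FM schema:

```python
# Illustrative sketch (assumed field names) of IM vs. HCD validation:
# both records pass the schema check; only one is actionable by a human.

def im_check(record):
    """IM validation: are the required fields present and populated?"""
    return all(record.get(k) for k in ("unique_id", "spare_part"))

def hcd_check(record):
    """HCD validation: can a technician act on the value under pressure?
    Here: is the spare part an orderable code rather than a pointer?"""
    return im_check(record) and "manual" not in record["spare_part"].lower()

compliant = {"unique_id": "AHU-N-04", "spare_part": "Refer to O&M Manual Vol 4"}
usable    = {"unique_id": "AHU-N-04", "spare_part": "Bag Filter F7 592x592"}

print(im_check(compliant), hcd_check(compliant))  # True False
print(im_check(usable), hcd_check(usable))        # True True
```

The point of the sketch: the “compliant” record is not wrong, it is merely unusable, and no amount of schema validation will catch that.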
The Proposal
This paper argues that we must stop treating data as a “Compliance Deliverable” and start treating it as a “Testable Product.”
This thinking has been shaped by my collaboration with industry leaders, including co-authoring “Unifying the built environment: a framework for operational value” with Pete Swanson (Mott MacDonald), where we first explored the concept of a Minimum Viable Handover Requirement (MVHR).
The following pages outline The Diagnosis (The Usability Gap) and The Solution (The MDHR as an MVP), demonstrated through two practical user stories.
Part 2: The Diagnosis
Defining the Usability Gap
The Core Misunderstanding
There is a widespread belief that “doing BIM well” or “following ISO 19650” solves the operational data problem. It does not. These are Information Management (IM) standards. They manage the supply of data (compliance, structure, security). They do not address the demand (how a human interacts with that data).
IM vs. HCD: The Critical Distinction
Information Management (IM)
- Focus: The data asset (Compliance)
- Role: Back-office technical discipline
- Validation: Technical: Is the data structured correctly? (e.g. COBie schemas)
- Goal: Governance, security, storage
- The Gap: Creates a “Data Asset”
Human-Centred Design (HCD)
- Focus: The human operator (Experience)
- Role: User-focused design methodology
- Validation: Usability: Can the user perform the task? (UX testing)
- Goal: Efficiency, effectiveness, clarity
- The Gap: Creates a “Usable Tool”
The Usability Gap
Because we stop at IM validation, we hand over data that is technically accurate but cognitively inaccessible. This creates the Usability Gap.
- The Standard Approach: “Here is the spreadsheet with the data.”
- The HCD Approach: “Here is the interface that helps you make a decision using that data.”
The Solution: Data as a Product
To close this gap, we must stop treating data as a static deliverable and start treating it as a Product. This requires a Minimum Viable Product (MVP) approach—defining a Minimum Data Handover Requirement (MDHR) and testing it with real users before handover.
Part 3: The “Straw Man” Tests: Two MVP Scenarios for Feedback
To the Engineers and Facility Managers reviewing this: These are hypothetical “User Tests.” If your projects passed these tests at handover, would it solve your operational headaches?
Scenario 1: The “Blind Technician” (Maintenance)
The User: Alex, a mobile tech with a tablet and no access to 3D models.
The Task: A “High Temp” alarm in Room 302. Identify the AHU and order the replacement filter.
The Compliance Pass (Current Status Quo):
- Alex searches “Room 302” in the app. No results. He checks a separate PDF drawing.
- He finds the SpareParts field. It reads: “Refer to O&M Manual Vol 4”.
- He goes to the plant room to physically measure the filter.
Result: 2 hours wasted. Data was compliant, but useless.
The MDHR/MVP Pass (The Proposal):
- Alex searches “Room 302.” The app immediately links to “AHU-N-04.”
- The SpareParts field reads: “Bag Filter F7 592×592”.
- He copies the code and orders the part from the van.
Result: Task done in 10 mins. Data was a Product.
Scenario 2: The “Phantom Load” (Energy)
The User: Sam, an Energy Manager.
The Task: Identify why the building used 400kW on Sunday when it should have been empty.
The Compliance Pass (Current Status Quo):
- Sam downloads CSVs from the Metering Cloud and BMS Cloud.
- Timestamps don’t align. She spends 4 hours in Excel creating a pivot table.
- She cannot see if the Chiller should have been on.
Result: “It’s probably the chiller.” (Low confidence).
The MDHR/MVP Pass (The Proposal):
- Sam opens the dashboard. It shows “Main Meter” vs. “Sum of Sub-Meters.”
- She drills down: Main → L4 Board → HVAC → Chiller 2.
- She overlays “BMS Enable Signal” (Off) vs. “Power Draw” (On).
Result: “Chiller 2 was in Manual Mode.” (High confidence).
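The overlay check Sam performs by eye can be sketched as a simple rule: flag any interval where an asset draws power while its BMS enable signal is Off. The record layout and the 5 kW idle threshold are illustrative assumptions, not a real BMS API:

```python
# Hypothetical sketch of the "enable vs. draw" overlay: flag intervals
# where an asset ran without a BMS command (e.g. left in Manual/Hand mode).
# Field names and the 5 kW idle threshold are assumptions for illustration.

IDLE_THRESHOLD_KW = 5.0  # draw below this is treated as standby noise

def find_manual_running(samples):
    """samples: list of dicts with 'time', 'enable' (bool), 'power_kw'.
    Returns the timestamps where the asset ran without being commanded on."""
    return [
        s["time"]
        for s in samples
        if not s["enable"] and s["power_kw"] > IDLE_THRESHOLD_KW
    ]

chiller_2 = [
    {"time": "Sun 13:00", "enable": False, "power_kw": 0.4},
    {"time": "Sun 14:00", "enable": False, "power_kw": 180.0},  # the spike
    {"time": "Sun 15:00", "enable": False, "power_kw": 178.5},
]

print(find_manual_running(chiller_2))  # ['Sun 14:00', 'Sun 15:00']
```

The MDHR point is not that Sam should write this code, but that the handover must supply the two aligned signals that make the question answerable at all.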
Appendix A: The Theoretical Framework
Detailed Analysis: Closing the Usability Gap
Executive Summary
The built environment faces a systemic challenge: technical standards for Information Management (IM) are failing to deliver their intended value, particularly in supporting operational decision-making. This failure stems from a fundamental misunderstanding and conflation of IM with Human-Centred Design (HCD).
While IM provides essential frameworks for managing data as a compliant asset, it inherently lacks the user-focused validation central to HCD. This creates a critical “Usability Gap”—a disconnect between technically compliant data and information that is genuinely usable by human operators.
The solution requires a paradigm shift: IM must be repositioned as the “technical servant” to HCD. This can be achieved by treating data deliverables not as abstract compliance exercises, but as testable products, specifically a Minimum Data Handover Requirement (MDHR) tested as a Minimum Viable Product (MVP) with end-users in pilot projects.
The Core Distinction: Information Management vs. Human-Centred Design
A central debate reveals a critical misunderstanding within the industry regarding the relationship between Information Management (IM) and Human-Centred Design (HCD). An initial assertion that “IM and HCD is the same thing” was refuted with the argument that they are distinct, albeit complementary, disciplines.
Recognising this distinction is crucial, as “misunderstanding this difference is precisely why the built environment is stuck in a project-focused loop.” The two fields differ in their curriculum, methodology, validation processes, and ultimate goals.
The Consensus
The two disciplines are not in conflict but must be integrated. A proper IM approach “should 100% contain” HCD principles. The ideal relationship positions IM “not as a destination, but as the technical servant of the HCD-defined operational purpose.”
Identifying the “Usability Gap”
The fundamental differences between IM and HCD create a critical vulnerability in project delivery: the Usability Gap.
Supply vs. Demand: IM standards focus on the supply side of information—ensuring consistent, high-quality data is structured correctly. HCD focuses on the demand side—how human users will interact with and consume that data.
The Missing Bridge: IM lifecycles lack the mandatory “Prototyping & Testing” phases that are inherent to HCD. These phases are the bridge that ensures a data product is not only compliant but also fit for its intended human purpose.
Data vs. Usable Tool: A standard can mandate the creation of a data asset, but it cannot guarantee that asset is a usable tool. An interface or application designed through HCD principles is what transforms raw data into actionable information.
Case Study: The UK Government FMS 002 Standard
The UK Government’s mandatory FMS 002: Asset Data standard serves as a clear case study for the Usability Gap.
Stated Goal: “To leverage better data to inform maintenance and investment decisions, support procurement, enhance compliance and safety, and drive improvements in service delivery.”
Methodology: FMS 002 mandates a defined data structure, taxonomies (Uniclass, NRM 3), and assurance frameworks (COBie). Its validation processes are technical assurance checks focused on data quality, completeness, and compliance.
The Gap in Action: The standard itself is a framework of rules, not a testable product. There is no requirement for user-experience testing with the facilities managers or maintenance engineers who are expected to use the data. A data drop can be perfectly compliant with FMS 002, yet be delivered through software so poorly designed that it actively hinders decision-making.
An Endemic Industry-Wide Problem
The Usability Gap is not unique to FMS 002 but is a systemic issue across the built environment’s standards and frameworks. These documents define rules and processes but are themselves abstract and untestable from a user’s perspective.
This challenge applies to a wide array of initiatives, including:
Process Standards: ISO 19650 (BIM), ISO 41001 (FM), ISO 55001 (Asset Management).
Data Standards: NRM 3 (Costing), SFG20 (Maintenance), BS 8587 (Facility Information).
Guidance Frameworks: The Smart Buildings Overlay to the RIBA Plan of Work, the Nima Information Management Initiative, and the buildingSMART IoT & Interoperability Project.
All these frameworks establish necessary technical pipelines but do not guarantee that the information flowing through them will be delivered in a usable format for human end-users.
Consequence: Fueling the Project-Operations Gap
The Usability Gap is a primary contributor to the industry’s Project-Operations Gap. The causal chain is as follows:
Project Phase: The project team focuses on delivering information that is technically compliant with IM standards to achieve contractual sign-off.
Handover: The data is transferred to the operations team. The Usability Gap is revealed as operators struggle to use the information via interfaces that were not designed or tested for their specific needs.
Operations Phase: Because the data is not easily actionable, it leads to inefficiencies, reliance on manual workarounds, and a failure to meet performance targets.
Plugging the Usability Gap is therefore a critical step in bridging the larger, systemic Project-Operations Gap.
Appendix B: Practical Application
MDHR Pilot Test Scripts
To move from abstract compliance to verifiable usability, projects must define specific “User Tests” during the commissioning phase. Below are two examples of Minimum Data Handover Requirement (MDHR) Pilot Scripts.
These scripts shift the definition of “Done” from “We transferred the files” to “The maintenance team successfully completed the operational task using the live system.”
Pilot Script 1: The “Blind Technician” (Maintenance)
Asset: Air Handling Unit (AHU)
Scenario: A “High Temperature” complaint has been received for the North Wing offices. The Facility Manager needs to identify the serving asset, check its service history, and order a replacement filter without visiting the plant room physically.
The User (Persona): “Alex,” a Mobile Maintenance Technician. Alex uses a tablet, has spotty Wi-Fi, and does not know how to use Revit or Navisworks.
Part 1: The Minimum Data Requirements (The MVP)
For this test to pass, the following 10 data points must be available, accurate, and accessible via the FM interface:
- Unique ID: (e.g., AHU-N-04)
- System Association: (Which rooms/zones does it serve?)
- Location: (Which plant room/Grid reference?)
- Manufacturer: (e.g., Trane, Daikin)
- Model Reference: (Specific model, not generic)
- Serial Number: (Crucial for warranty/parts)
- Installation Date: (For lifecycle calculation)
- Warranty Expiry Date: (To decide who pays for the fix)
- Spare Part ID (Filter): (e.g., Bag Filter F7 592×592)
- Last Maintenance Date: (Dynamic data)
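A first automated gate for this script could check the ten data points above before the human usability test is run. This is a sketch only: the field names map one-to-one onto the list above but are assumptions, and the sample values (model reference, serial number, dates) are hypothetical:

```python
# Sketch of an automated MDHR completeness gate for Pilot Script 1.
# Field names mirror the ten data points listed above; sample values
# are hypothetical placeholders, not real asset data.

REQUIRED_FIELDS = [
    "unique_id", "system_association", "location", "manufacturer",
    "model_reference", "serial_number", "installation_date",
    "warranty_expiry_date", "spare_part_filter", "last_maintenance_date",
]

def missing_mdhr_fields(asset):
    """Return the required fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not asset.get(f)]

ahu = {
    "unique_id": "AHU-N-04",
    "system_association": ["Room 302"],
    "location": "Plant Room N, Grid C4",      # hypothetical
    "manufacturer": "Trane",
    "model_reference": "CCTA-012",            # hypothetical model string
    "serial_number": "SN-88120045",           # hypothetical
    "installation_date": "2023-06-14",
    "warranty_expiry_date": "2026-06-14",
    "spare_part_filter": "Bag Filter F7 592x592",
    "last_maintenance_date": "2025-01-10",
}

print(missing_mdhr_fields(ahu))  # [] -> data gate passes; run the user test
```

Passing this gate is necessary but not sufficient: Alex's timed test in Part 2 remains the actual definition of “Done.”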
Part 2: The Usability Test Protocol
Instruction to User: “Start at the dashboard. You have received a complaint for Room 302. You have 5 minutes. Diagnose the unit and define the spare part needed.”
Part 3: The Pass/Fail Analysis
- Scenario A (Compliance Pass): The BIM Manager signed off because all COBie fields were populated. However, the SpareParts field said “See Manual Vol 4”. Alex failed the test because he couldn’t order the part from the tablet. Verdict: FAIL.
- Scenario B (MDHR Pass): The SpareParts field contained the specific string “Bag Filter F7 592×592”. Alex identified the unit was out of warranty and issued a PO within 3 minutes. Verdict: PASS.
Pilot Script 2: The “Phantom Load” (Energy)
Asset: Virtual Metering & BMS Overlay
Scenario: It is Monday morning. The main utility meter shows a significant energy spike on Sunday afternoon when the building should have been empty. The Energy Manager needs to identify the specific system responsible within 15 minutes.
The User (Persona): “Sam,” an Energy Manager. Sam understands kW and carbon factors but does not write code or merge CSV files manually.
Part 1: The Minimum Data Requirements (The MVP)
For this test to pass, the system must automatically present the following integrated data:
- Main Meter Feed: (Real-time consumption).
- Sub-Meter Hierarchy: (Parent/Child relationships defined—e.g., Main → L4 Dist Board → L4 Lighting).
- Naming Convention: (Functional names, e.g., “L4-North-Lighting,” not “MTR-004-X”).
- Unit Normalization: (All data converted to kWh automatically).
- Design Target: (The “predicted” energy usage for a Sunday).
- BMS State Overlay: (A binary data point showing if the equipment was commanded “On” or “Off”).
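The “virtual meter” logic these requirements imply can be sketched in a few lines: subtract the sum of the functionally named sub-meters from the main meter for the same interval, with all readings already normalised to kWh. Meter names and readings below are illustrative assumptions:

```python
# Sketch of the virtual-meter remainder check: any gap between the main
# meter and the sum of its sub-meters is unmetered ("phantom") load.
# Meter names and kWh values are illustrative assumptions.

def virtual_meter_remainder(main_kwh, sub_meters):
    """sub_meters: dict of functional name -> kWh over the same interval.
    Returns the kWh not accounted for by any sub-meter."""
    return main_kwh - sum(sub_meters.values())

sunday_14 = {
    "L4-North-Lighting": 12.0,
    "L4-HVAC-Chiller-1": 0.5,
    "L4-HVAC-Chiller-2": 180.0,   # the rogue load
    "L4-Small-Power": 7.5,
}

print(virtual_meter_remainder(205.0, sunday_14))  # 5.0 kWh unaccounted for
```

Note that the arithmetic is trivial; what the MDHR actually mandates is the hierarchy, naming, and unit normalisation that make it possible without a 4-hour Excel session.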
Part 2: The Usability Test Protocol
Instruction to User: “You see a spike on Sunday at 14:00. Drill down to find the specific load causing it and confirm if the equipment was scheduled to be running.”
Part 3: The Pass/Fail Analysis
- Scenario A (Compliance Pass): The meters work and data is valid. But Sam fails because she has to download two separate CSV files (Metering and BMS) and align timestamps in Excel. The process takes 4 hours. Verdict: FAIL.
- Scenario B (MDHR Pass): The “Virtual Meter” logic works. The system subtracts sub-meters from the main meter and visualizes the remainder. Sam identifies the rogue chiller in minutes. Verdict: PASS.