
Design Optimization Flashblog: A Few Words about Process Analytics

By Tyler Goss of CASE on May 06, 2014

This post is part of the BIMForum Flash Blog!

I recently gave a talk at BIMForum’s spring event in Boston, and I figured that is as good a reason as any to restart my efforts over here at buildingdatum. If you’re into seeing what I look like as I meander through a presentation, you can watch it here. And if you want to see me kicking it onstage with BIM legend Patrick MacLeamy, that’s right here.


For those of you who follow me in other spaces, you’ll already know that I’m fond of describing the building lifecycle as a series of informational transactions. A project begins life as an idea, which could take the form of a sketch, a pro forma, a mission statement, or some other clear enumeration of the owner’s goals for the building process. This kernel is shepherded through a series of domain experts, each ostensibly adding value to the process as they go, until that initial idea has become a complex arrangement of thousands, millions, or even billions of interrelated and interdependent assemblies – all hopefully in service of that initial goal. And in theory, this information flows freely, abundantly, and precisely from expert to expert.

In reality, things are a bit murkier.

The social, technological, and legal constraints under which our industry operates all contribute to this fog of war. Limited data-gathering capacity and tedious data-gathering practices inhibit teams from developing integrated approaches to business processes like estimating, sequencing, or facilities management. The ecosystems that we use to develop our designs and plan our construction are often closed or discrete by design. Limited accountability for the standard of care in contractual documents creates unreliable data sets and concomitant “garbage-in-garbage-out” problems. And of course, lest we forget the oldest chestnut of industry tribal knowledge, each project is a “unique snowflake” with idiosyncratic constraints and opportunities that will never quite translate to other projects.

But while every project may be superficially unique, each is nevertheless composed of a series of highly granular, highly repeatable processes – processes that can be measured, analyzed, and optimized across a wide range of projects. Processes that can be measured with very simple and generalizable rate variables – clashes resolved per manhour, design elements created per day, and so forth – are foundational to true improvement. Ultimately, creating greater understanding of the internal nanoeconomies of design and construction has huge value for our industry. By analyzing our processes rather than our outcomes, we can begin to remove – empirically – the waste that is distributed and hidden within our informational systems and our projects.
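To make the idea of generalizable rate variables concrete, here is a minimal sketch of how such metrics might be computed and compared across projects. The project names, counts, and hours below are entirely hypothetical, and the `ProcessLog` structure is an assumption for illustration – not a real tool or data format from the post.

```python
# Hypothetical sketch: computing a generalizable process-rate metric
# (clashes resolved per man-hour) across several projects.
# All names and figures are invented for illustration.

from dataclasses import dataclass


@dataclass
class ProcessLog:
    """One project's record for a single repeatable process."""
    project: str
    clashes_resolved: int
    manhours: float


def clash_rate(log: ProcessLog) -> float:
    """Clashes resolved per man-hour for one project."""
    return log.clashes_resolved / log.manhours


logs = [
    ProcessLog("Project A", clashes_resolved=420, manhours=300.0),
    ProcessLog("Project B", clashes_resolved=510, manhours=250.0),
]

# Because the metric is a simple rate, it is comparable across
# otherwise dissimilar ("unique snowflake") projects.
for log in logs:
    print(f"{log.project}: {clash_rate(log):.2f} clashes/manhour")
```

The point of the sketch is only that rates normalize away project size: two projects with very different scopes still yield directly comparable numbers, which is what makes cross-project process analysis possible.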

[Tyler’s original post can be found here].