TT+20: Harnessing The Power Of Big Data

Aug 12, 2015

Earlier this year, international structural engineering firm Thornton Tomasetti released an annual report that was much more than numbers and pats on its own back. Instead, the NYC-based firm spoke with several industry experts outside its own doors and surveyed its 850 employees in 26 offices worldwide. The result was a series of thoughtful articles about where the global AEC industry will be in 20 years. Here, we present the final of four excerpts.

What is big data? What can we do with it?

The term “big data” emerged in the 1990s in the Silicon Valley computer graphics industry and was quickly adopted by businesses such as Google and Amazon that amass vast amounts of information about their users. Today, big data is everywhere, and it is defined by three V's and one A:

  • Volume: “Big” starts in the terabyte range and goes up;
  • Variety: Big data is a mishmash. In our work it might include photos, blogs, chats, reports, analyses, models, sensor data and scanned paper documents;
  • Velocity: The rate at which data change or are updated. Big data has high velocity (like Amazon’s 426 transactions per second during the 2013 holiday season), and no AEC data yet approach big data velocities;
  • Analysis: Making sense of big data and finding patterns in it requires an enormous amount of RAM – often more than any organization short of NASA will have. Big data analysis usually requires cloud computing.

You can’t find true big data in the AEC world – yet. What you can find are the precursors, and some big data techniques starting to appear. There are several enablers of this change: the falling cost of computing power, storage and network connectivity; the availability of cloud computing; and perhaps most importantly, openness to experimenting with how technology is changing human roles and relationships. Here are some of the elements we see in Big Data...

[Infographic: relative data volumes – all TT data = 75 TB]

TTX: Universal translator

Thornton Tomasetti was an early adopter of building information modeling (BIM), and we soon saw the need to write translators to move design, analysis and fabrication information automatically from one platform to another.

As software packages proliferated, maintaining a library of translators became impractical. The translators also lacked a critical bit of functionality: the ability to update an existing model rather than recreate it. And whenever a program was updated, its translator needed an update as well.

Instead, we developed a kind of software Esperanto we called TTX: a database that all AEC platforms can talk to. So far, we have Grasshopper, Revit, Tekla, SAP, RAM and ETABS all talking to each other and supporting cross-platform model updates. Although the original intent of TTX was interoperability, it is increasingly used to capture analysis data iteratively as well.
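The core idea behind a hub like TTX can be sketched in a few lines. The sketch below is illustrative only – the class names, fields and adapter function are hypothetical, not TTX's actual schema or API – but it shows how a single neutral representation lets any two platforms exchange and update elements without pairwise translators.

```python
# Hypothetical sketch of a hub-and-spoke interoperability store.
# Names and fields are illustrative, not Thornton Tomasetti's actual TTX schema.
from dataclasses import dataclass


@dataclass
class NeutralBeam:
    """Platform-agnostic record for one structural element."""
    element_id: str          # stable ID shared by every platform
    section: str             # e.g. "W18x35"
    start: tuple             # (x, y, z) in project coordinates
    end: tuple
    source: str = "unknown"  # which tool last wrote it


class HubStore:
    """One shared store instead of N*(N-1) pairwise translators."""

    def __init__(self):
        self._elements = {}

    def upsert(self, beam: NeutralBeam):
        # Update in place if the element already exists, otherwise insert,
        # so downstream tools see a model update rather than a new model.
        self._elements[beam.element_id] = beam

    def export_for(self, platform: str):
        # Each platform needs only one adapter: neutral -> native.
        return [self._to_native(b, platform) for b in self._elements.values()]

    @staticmethod
    def _to_native(beam: NeutralBeam, platform: str) -> dict:
        # Placeholder for a per-platform adapter (Revit, Tekla, ETABS, ...).
        return {"platform": platform, "id": beam.element_id, "section": beam.section}


store = HubStore()
store.upsert(NeutralBeam("B-101", "W18x35", (0, 0, 0), (9.0, 0, 0), source="Grasshopper"))
print(store.export_for("Revit"))
```

With a hub like this, adding a new platform means writing one adapter rather than one translator per existing platform.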

TTX saves all data in one database, rather than as separate files in separate folders. This opens the door to new kinds of analyses. Previously, we could tell differences between models; with TTX, we can understand details of those differences. The result of this analysis will be more modeling iterations in less time, on a universal platform that the entire project team can share.
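Storing every version in one place is what makes detailed comparisons cheap. The snippet below is a hedged illustration of that idea – the element records and property names are made up for the example – showing how two stored versions of a model can be diffed at the level of individual properties rather than merely flagged as different.

```python
# Illustrative model diff over two versions stored in one database.
# The records and property names are hypothetical, not TTX's real data model.

def diff_models(old: dict, new: dict) -> dict:
    """Return per-element property changes between two model versions."""
    changes = {}
    for elem_id in old.keys() | new.keys():
        before, after = old.get(elem_id), new.get(elem_id)
        if before is None:
            changes[elem_id] = {"status": "added"}
        elif after is None:
            changes[elem_id] = {"status": "removed"}
        elif before != after:
            changes[elem_id] = {
                "status": "modified",
                "properties": {k: (before[k], after.get(k))
                               for k in before if before[k] != after.get(k)},
            }
    return changes


v1 = {"B-101": {"section": "W18x35", "length_ft": 30.0}}
v2 = {"B-101": {"section": "W21x44", "length_ft": 30.0},
      "C-07":  {"section": "W14x90", "length_ft": 14.0}}
print(diff_models(v1, v2))
# (order may vary)
# {'B-101': {'status': 'modified', 'properties': {'section': ('W18x35', 'W21x44')}},
#  'C-07': {'status': 'added'}}
```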

Borrowing from big data analysis approaches, we have also prototyped a web-based tool that lets TTX users run real-time reports on their models and easily sync the different models used during different project phases, making the transitions between those phases easier.

Daylighting, point clouds & laser scanning

Daylighting analysis requires large data sets at two stages: data acquisition and data analysis. Acquiring daylighting data for one room typically means sampling about 50 data points for each of the 8,760 hours in a year. An even larger data set is generated during ray tracing, which establishes the relationship between daylight and interior illumination in a defined space. Depending on the density of sampling, the volume of the space, and the number of spaces sampled, ray-tracing data may be several orders of magnitude larger than the raw data. This can still be processed locally, but large-scale sampling of large buildings could easily require a cloud solution.
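To give a rough sense of scale, the back-of-the-envelope calculation below uses the figures in the paragraph above; the bytes-per-sample and ray-tracing multiplier are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope size estimate for one room-year of daylighting data.
# Bytes per sample and the ray-tracing blow-up factor are assumptions
# for illustration, not measured figures.
HOURS_PER_YEAR = 8_760
POINTS_PER_HOUR = 50       # from the text: ~50 sample points per hour per room
BYTES_PER_SAMPLE = 32      # assumed: timestamp + position + illuminance value
RAYTRACE_FACTOR = 1_000    # assumed: "several orders of magnitude" larger

raw_samples = HOURS_PER_YEAR * POINTS_PER_HOUR      # 438,000 samples per room-year
raw_bytes = raw_samples * BYTES_PER_SAMPLE          # ~14 MB per room-year
raytrace_bytes = raw_bytes * RAYTRACE_FACTOR        # ~14 GB per room-year

print(f"{raw_samples:,} samples -> {raw_bytes / 1e6:.1f} MB raw, "
      f"{raytrace_bytes / 1e9:.1f} GB after ray tracing")
# 438,000 samples -> 14.0 MB raw, 14.0 GB after ray tracing
```

One room stays manageable on a workstation; hundreds of rooms across a portfolio of buildings is where cloud computing starts to earn its keep.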

Laser scanning, a technique that creates an undistorted “point cloud” 3D image, is finding application in forensic analysis. A study may require dozens of point-cloud scans, each averaging about 10 GB. Laser scanning allows for easy measurement of component position and dimensions, and comparison of changes over time. We have used laser scanning to quantify floor deflection following removal of a heavy curtain wall, and to help façade engineers prepare a lighter-weight replacement that accommodates the new floor position. We have also used it to diagnose the cause of wall bulging in an arena, to rule out deformation from excessive roof-rigging loads.
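Comparing scans over time comes down to measuring how far each point has moved between surveys. The sketch below, which assumes NumPy and SciPy are available and uses made-up coordinates, shows one common approach: nearest-neighbor distances between a baseline scan and a later one as a proxy for deflection.

```python
# Rough sketch: estimate movement between two point-cloud scans by
# nearest-neighbor distance. Coordinates here are made up; real scans
# would be loaded from ~10 GB scan files, likely in chunks.
import numpy as np
from scipy.spatial import cKDTree

baseline = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [2.0, 0.0, 0.0]])      # floor points before curtain-wall removal
followup = np.array([[0.0, 0.0, 0.004],
                     [1.0, 0.0, 0.011],
                     [2.0, 0.0, 0.018]])    # same region scanned afterwards

tree = cKDTree(baseline)
distances, _ = tree.query(followup)  # distance from each new point to nearest old point

print(f"max apparent movement:  {distances.max() * 1000:.1f} mm")
print(f"mean apparent movement: {distances.mean() * 1000:.1f} mm")
# max apparent movement:  18.0 mm
# mean apparent movement: 11.0 mm
```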

In the near future, the variety of data will increase as photographs, MEP equipment condition, flood levels and other data are incorporated into scans. Further down the road, application of finite element analysis will enable us to determine stress conditions in nonlinear geometries, such as sculptural pieces or unusual connection designs.

Artwork, above and below, c/o Thornton Tomasetti. For more, visit www.WhereAreWeGoing.com.
