AEC Technology Hackathon 2016


Last month I had the pleasure of attending the fourth annual AEC Technology Symposium and Hackathon put on by Thornton Tomasetti's CORE Studio in New York City. The symposium kicked off with many fantastic speakers; I highly recommend checking out the full videos of the presentations over on the TT CORE Studio YouTube playlist. As with last year's symposium, I was personally most impressed with the work presented by Luc Wilson and Mondrian Hsieh demonstrating the use of computational design and custom digital tools for urban planning and visual analysis with Kohn Pedersen Fox's Urban Interface.

This year was also my first ever participation in a hackathon. I registered with the goal of teaming up with technology enthusiasts and individuals from other disciplines to see if I could help develop a solution for some of the pain points frequently encountered during the design and documentation process. My hope was that I could combine my Dynamo knowledge and my experience uncovering barriers in architectural practice with the chance to learn something about coding bespoke applications and user interfaces from those more familiar with the software side of the industry.

Image courtesy of Thornton Tomasetti CORE Studio

The goals of the hackathon were simple...
This event is organized for programmers, web developers, and AEC experts to collaborate on novel ideas and processes for the AEC industry. The focus will be on digital/computational technologies that have been used on projects, the lessons learned from them, and how they impacted the overall project workflows. The Hackathon aims for attendees to learn new skills and generate new ideas and processes for the AEC community through data-driven design and customized applications.

Everyone had approximately 24 hours to assemble teams, formulate an idea, and get to work trying to create a prototype. After the 24 hours, each team was to report out on what they had created and a panel of judges would determine the winners. For the first hour, individuals from the group of 60 or so hackers had a chance to pitch their ideas and attempt to attract a team. Following introductions, everyone mingled and quickly decided which topics they found most interesting and figured out what skill sets were required to fulfill the goals of the project.

The team I joined forces with was drawn to an idea originally proposed by Timon Hazell:
When continuously exchanging Revit models among the constituents on a building project, it is a time-consuming process to track down what changed between versions. In an ideal world, the architects, engineers, or consultants sending the updated model would write a summary or list of the changes, but this rarely happens in practice. Therefore, the traditional approach typically involves a painstaking process of opening both models simultaneously on separate monitors and spotting differences via visual comparison. Is there a better way to see what has changed between two versions of a Revit model and analyze just how much has changed throughout the project?

We ended up with a fantastic team of diverse perspectives to tackle this problem:
Me - represented the architecture side: knowledge of project delivery, recurring challenges, and opportunities for process optimization
Timon - represented the engineering side: spends significant time receiving and interpreting design intent from the architect with little documentation of changes
Charles - represented the consultant side: acoustician with decades of architecture experience, also regularly receives design intent from the architect and must interpret it
Matt - represented the software side: experience developing custom digital tools and troubleshooting prepackaged software solutions to enhance AEC production

From left to right: Kyle Martin, Matt Mason, Charles Prettyman, Timon Hazell

The first step was to define the problem: what are all the factors that constitute a change in a Revit model? After some brainstorming, we identified four key change types:

  1. Elements added to the model
  2. Elements deleted from the model
  3. Family type or information parameter changes
  4. Geometry changes: location, shape, or size

We set out to create a two-part solution to this problem. First, a C# Revit add-in that essentially acts as a "diff," comparing all Revit elements between two models and generating a list of viewable items. Second, a JSON file and accompanying Dynamo workflow that would produce a data visualization for targeting concentrations of changes throughout the project.
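
To make the "diff" idea concrete, here is a minimal sketch of how two documents might be compared using the Revit API. It is illustrative only: the class and method names (ModelDiff, ChangeRecord, and so on) are my own assumptions, geometry comparison is left out for brevity, and it is not the actual add-in source, which lives in the GitHub repository linked below.

```csharp
// Minimal, illustrative sketch (not the actual Metamorphosis source):
// diff two Revit documents by UniqueId to find added, deleted, and
// parameter-changed elements. Geometry comparison is omitted here.
using System.Collections.Generic;
using System.Linq;
using Autodesk.Revit.DB;

public enum ChangeType { Added, Deleted, ParameterChange, GeometryChange }

public class ChangeRecord
{
    public string UniqueId { get; set; }
    public string Category { get; set; }
    public ChangeType Type { get; set; }
}

public static class ModelDiff
{
    public static IList<ChangeRecord> Compare(Document previous, Document current)
    {
        // Key every model element by its UniqueId, which persists across
        // saves/synchronizations of the same element.
        var before = new FilteredElementCollector(previous)
            .WhereElementIsNotElementType().ToElements()
            .ToDictionary(e => e.UniqueId);
        var after = new FilteredElementCollector(current)
            .WhereElementIsNotElementType().ToElements()
            .ToDictionary(e => e.UniqueId);

        var changes = new List<ChangeRecord>();

        foreach (var pair in after)
        {
            if (!before.ContainsKey(pair.Key))
                changes.Add(Record(pair.Value, ChangeType.Added));
            else if (ParametersDiffer(before[pair.Key], pair.Value))
                changes.Add(Record(pair.Value, ChangeType.ParameterChange));
        }

        foreach (var pair in before)
        {
            if (!after.ContainsKey(pair.Key))
                changes.Add(Record(pair.Value, ChangeType.Deleted));
        }

        return changes;
    }

    // Compare parameter values as display strings; a fuller implementation
    // would also hash element geometry to catch size/shape/location changes.
    private static bool ParametersDiffer(Element before, Element after)
    {
        foreach (Parameter p in after.Parameters)
        {
            Parameter match = before.LookupParameter(p.Definition.Name);
            if (match == null) return true;  // parameter added
            string oldValue = match.AsValueString() ?? match.AsString();
            string newValue = p.AsValueString() ?? p.AsString();
            if (oldValue != newValue) return true;
        }
        return false;
    }

    private static ChangeRecord Record(Element e, ChangeType type)
    {
        return new ChangeRecord
        {
            UniqueId = e.UniqueId,
            Category = e.Category != null ? e.Category.Name : "(none)",
            Type = type
        };
    }
}
```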

Our trusty C# guru Matt immediately began coding the Revit add-in while the rest of the team created sample Revit models and cartooned out the data visualization component. After many hours of relentless coding, the first add-in prototype was ready to test. With a few rounds of troubleshooting we were able to isolate the first list of altered Revit elements and export the first JSON file. The parameters associated with each Revit element contained within the JSON file allowed us to start building a Dynamo definition to restructure and visualize the data using the Mandrill package from Konrad Sobon. By early morning we had a working Revit add-in that mostly accomplished what we were looking for, and we began working out the kinks in the Dynamo workflow. As time began to evaporate in the final hours, we scrambled to test and troubleshoot the tools, assemble our presentation, and develop documentation. Ultimately we decided on the name Metamorphosis to represent the transformation of Revit models over time and their evolution into thoroughly coordinated built form.

At the end of the Hackathon, approximately a dozen projects were presented to the judges in 5-minute maximum time allotments. Our team tried our best to efficiently explain the initial idea and walk the crowd through how the tools we developed were a viable solution that would be easy to deploy to the average Revit user. After some deliberation, the winners were announced and we were thrilled to find out that we took second place, in large part because of the practicality of the problem we chose and our willingness to share the solution as open source.

And the development didn't stop there...

Following the hackathon, the Revit add-in code was improved to fine-tune some of the desired features. In addition, the Dynamo definition was cleaned of its hackathon-induced spaghetti and properly labeled. Most importantly, everything was updated and organized into a GitHub repository.

INTRODUCING "METAMORPHOSIS" - An Open Source Revit Change Analysis Tool
Running the model comparison add-in results in a list of Revit elements that can be filtered and re-sorted. Clicking on the categories and individual elements adds them to the selection in the active view and zooms to their location.

List of changed elements sorted By Category (left) or By Change Type (right)
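
The selection and zoom behavior described above maps neatly onto a couple of standard Revit API calls. The sketch below is an assumption of how it could be wired up (the names are hypothetical and it is not lifted from the add-in itself):

```csharp
// Illustrative sketch: select changed elements in the active view and zoom
// to them, given their stored UniqueIds. Names here are assumptions, not
// the actual Metamorphosis code.
using System.Collections.Generic;
using Autodesk.Revit.DB;
using Autodesk.Revit.UI;

public static class ResultNavigation
{
    public static void SelectAndZoom(UIDocument uidoc, IEnumerable<string> uniqueIds)
    {
        var ids = new List<ElementId>();
        foreach (string uid in uniqueIds)
        {
            Element e = uidoc.Document.GetElement(uid);  // look up by UniqueId
            if (e != null) ids.Add(e.Id);
        }
        if (ids.Count == 0) return;

        uidoc.Selection.SetElementIds(ids);  // add to the current selection
        uidoc.ShowElements(ids);             // zoom the active view to fit them
    }
}
```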

Clicking the Color Elements button applies an Override Color in View to all elements that fall under three change types (a rough sketch of how this could work in the Revit API follows below):

  1. Green - New Elements
  2. Blue - Geometry Change (size or shape)
  3. Red - All Other Changes: modifications to parameters, location, rotation, etc.
The Color Elements feature works in any view type: plan, RCP, section, 3D axon, etc.
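
Under the hood, per-view color overrides like this are typically done with OverrideGraphicSettings. The following is a rough sketch of that approach under my own assumptions; it redeclares the hypothetical ChangeType enum from the diff sketch so it stands alone, and it is not the actual add-in code.

```csharp
// Rough sketch (assumed, not the actual Metamorphosis source): apply
// per-view color overrides to changed elements, keyed by change type.
using System.Collections.Generic;
using Autodesk.Revit.DB;

public enum ChangeType { Added, Deleted, ParameterChange, GeometryChange }

public static class ChangeColorizer
{
    public static void Apply(Document doc, View view,
        IDictionary<ElementId, ChangeType> changes)
    {
        // Colors mirroring the write-up: green = new, blue = geometry change,
        // red = everything else.
        var green = new Color(0, 200, 0);
        var blue  = new Color(0, 0, 220);
        var red   = new Color(220, 0, 0);

        using (var tx = new Transaction(doc, "Color changed elements"))
        {
            tx.Start();
            foreach (var pair in changes)
            {
                Color color = pair.Value == ChangeType.Added ? green
                            : pair.Value == ChangeType.GeometryChange ? blue
                            : red;

                var ogs = new OverrideGraphicSettings();
                ogs.SetProjectionLineColor(color);       // recolor element lines
                view.SetElementOverrides(pair.Key, ogs); // per-view override only
            }
            tx.Commit();
        }
    }
}
```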

For some of the change types, an Analysis Visualization Framework (AVF) object appears:

  1. A box for an element that has been removed
  2. Arrow(s) for an element that has changed location (in the case of elements like walls, there are two location points so you get two arrows)
  3. A symbol if an element has been rotated

On the Dynamo side, opening the .dyn file and browsing to the exported JSON file will process the accompanying data for visualization in Mandrill. Clicking "Launch Window" on the Report Window node at the far right opens the interactive data visualization window containing four chart types:

  1. Donut Chart (upper left) - number of changes by element type
  2. Stacked Bar Graph (upper right) - number of changes by change type
  3. Bar Graph (lower left) - percentage of items changed vs. total number of items for each category
  4. Parallel Coordinates (lower right) - number of changes for each level, each overlapping line represents a different Revit element category
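
For context, a single change record in the exported JSON might look something like the following. This is a hypothetical structure with assumed field names, shown only to suggest the kind of category, level, and change-type data behind the charts above; the actual file produced by the add-in may differ.

```json
{
  "uniqueId": "60f92a6c-9bd2-4a3b-8c5e-000245ab",
  "category": "Walls",
  "familyType": "Basic Wall : Generic - 8\"",
  "level": "Level 2",
  "changeType": "GeometryChange"
}
```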

METAMORPHOSIS VALUE SUMMARY

  • colorize elements in any active view to quickly identify changes, far more efficient than previous methods
  • color by change type allows you to target specific changes
  • sorting, filtering, and element selection in the add-in interface allow for quick location and isolation of elements
  • quickly evaluate where most changes are occurring with analytics/visualization; this is particularly useful if the model arrives with no documentation
  • compare the current state to any previous model, helpful for telling the story of where and how much the model has changed over time
  • not just a tool for coordinating/viewing changes, but also for making sure you cloud revisions as you go if the drawing set has already been issued


Interested in trying this tool out? Here is where you can access the datasets and learn more:

GitHub Repository
DevPost page
Presentation slides
YouTube Screen Capture Demonstration


In the end I got exactly what I wanted out of the hackathon experience. I was able to work with three individuals who possessed completely different skill sets from my own. I provided the team with background context and an understanding of the problem from an architectural perspective so that we could devise a technological solution. More specifically, Timon and I pushed ourselves to utilize tools that we would not regularly encounter in practice and to capitalize on the opportunity to learn the Mandrill package for Dynamo, JSON data formatting, Revit add-in configuration, and how to establish a GitHub workflow for sharing and maintaining the associated files.

A huge thank you goes out to the Thornton Tomasetti crew who worked so hard to put on such a well-executed event. Thanks to the judges who volunteered their time to hear all of our frenzied, sleep-deprived 5-minute-plus presentations. Lastly, shout out to my teammates who all worked tirelessly to make our idea a reality!

AEC Technology Symposium 2015


AEC Technology Symposium 2015
Hosted by Thornton Tomasetti (NYC)
Baruch College
September 25th, 2015

RESEARCH & DEVELOPMENT IN AEC SESSION 1:
Measurement Moxie
Christopher Connock - Kieran Timberlake

Christopher Connock emphasized the importance of approaching each architecture project as an experiment — an opportunity to test new technology and ideas — a philosophy that Kieran Timberlake incorporates into all of its projects. They are particularly exploring the frontier of data capture, using a wireless sensor network to gather building performance analytics. The comparison of plant diversity and placement against soil moisture and temperature sensors in a green roof can help assess drainage, plant health, and solar gain over time. Temperature and relative humidity sensors can be used to investigate an entire space or focus on a particular application such as the performance of materials in a building envelope. Hundreds of different sensors placed throughout a building can track and transmit environmental changes across a given day or even across seasons. Kieran Timberlake implemented a wireless sensor network to help inform the renovation of their new offices in the Ortlieb Bottling Plant in Philadelphia and then developed an in-house app to capture post-occupancy feedback from their own employees about the overall comfort of the space and to identify abnormal conditions. All of this research contributed to the development of Tally — a life-cycle assessment (LCA) app and add-in for Revit that evaluates the environmental impact of building materials among design options and promotes a much more eco-conscious approach to design.

Grow Up, Grasshopper!
Andrew Heumann - NBBJ

Andrew Heumann believes in the need to change the perception of design technology in the AEC industry and integrate it more fully into practice. He showcased an extensive portfolio of projects that have used Grasshopper for Rhino and custom-written apps to simulate inner-office traffic patterns and sight lines, to leverage human location and city data for urban planning, and to track digital tool usage in the office to identify focus areas for development and support. All scripts and tools developed in-house at NBBJ are documented and packaged into products for use by project teams. In addition, custom dashboards and user interfaces help reduce intimidation and increase universal adoption — for example, reducing a complex Grasshopper script to a series of slider bars that control the inputs of a parametric design. Andrew also advocated for the use of hackathons and similar hands-on user meeting formats to promote design technology as a facet of culture and process. He shared an example of how a brief hackathon with senior partners at NBBJ led to the funding of a proposal for further development of an innovative tool for optimizing healthcare patient room configurations.

Evolving Modes of R+D in Practice
Stephen Van Dyck and Scott Crawford - LMN Architects / LMN Tech Studio

The Tech Studio was founded to support the prominent role of research and development at LMN Architects in Seattle, which has led to an expanded use of analytic and generative tools to drive design. They have not only embraced the use of custom digital tools for the creation and visualization of complex forms but also regularly construct scale models for material testing and to explore modular strategies as part of their iterative design process. Working with fabrication in mind facilitates improved precision for collaboration with engineers and consultants. In addition, they have found that a thorough digital process and physical models help better communicate design ideas, resulting in increased positive community feedback. The development of a ray-tracing tool for exterior acoustics studies, custom panel creation to balance musical acoustics and aesthetics, and a highly parametric pedestrian bridge spanning a major Seattle highway are a few examples of projects that demonstrate how research is a guiding principle for design at LMN.

OPEN-SOURCE DATA AND APPLICATION:
Collaboration and Open Source - How the Software Industry’s Approach to Open Sourcing Non-Core Technology Has Created Innovation
Gareth Price - Ready Set Rocket

This presentation provided insight into the current state of technological innovation through the lens of a digital advertising agency. Gareth Price emphasized that individuals should not be hesitant to share ideas out of fear that another company will benefit from them. Particularly in the AEC industry, companies do not have the overhead to pay for tool creation and the requisite support, nor can they cover the cost of an outside software consultant. The reality is that other people are busy with their own work and do not have the time or resources to steal your ideas and commodify them. More importantly, it is advantageous to share ideas for a project because they may elicit constructive criticism, or inspire others to contribute to those ideas and improve them. Also, do not get too entrenched in one idea and know when to pivot — the next great idea may come as an unexpected derivative of the original intention.

Key quotes:
"purpose is the DNA of innovation"
"failure is the new R&D"

How Open Source Enables Innovation
Mostapha Roudsari and Ana Garcia Puyol - CORE studio / Thornton Tomasetti

Mostapha Roudsari and Ana Garcia Puyol exhibited many examples of digital tools that have emerged out of CORE Studio - the research and development arm of Thornton Tomasetti. The majority of examples presented originated during previous CORE AEC Technology Hackathons and were then further developed into more robust products. Nearly every tool required collaboration from multiple individuals with expertise in a diverse mix of software platforms, oftentimes representing different companies. The takeaway from this presentation was the value of open source and hackathons as a means of getting a group of talented people into one room to create new tools for AEC design and representation. Mostapha wanted to make clear that more important than software tools, code, and machining is the strength and power of the user community. If you want to be at the forefront of the movement, be a developer; but the community is just as important, so make an effort to share ideas and spread adoption.

Here are some of the many tools presented:

  • vAC3: open source, browser-based 3D model viewer. This project led TT to further develop Spectacles.
  • Spectacles: allows you to export BIM models to a web interface that allows you to orbit in 3D, select layers, and access embedded BIM information (demo HERE)
  • VRX (Virtual Reality eXchange): a method for exporting BIM models for virtual reality viewing via Google Cardboard
  • DynamoSAP: a parametric interface that enables interoperability between SAP2000 (structural analysis and design), Dynamo, and Revit
  • Design Explorer: "an open-source web interface for exploring multi-dimensional design spaces"
  • Pollination: "an open source energy simulation batch generator for quickly searching the parameter space in building design"

For more information, check out TT CORE Studio's GitHub, Projects, and Apps pages

Open Source: Talk 3
Matt Jezyk - Autodesk

Matt Jezyk provided an introduction to Dynamo, including its history and the most recent developments. Dynamo may have started as a visual programming add-in for Revit, but it is quickly transforming into a powerful tool for migrating data and geometry across numerous software platforms. The talk highlighted the role open source has played in empowering independent developers to create custom content that expands capabilities and makes interoperability possible. Because Dynamo is open source, it has benefited from contributions by individuals with a wide range of expertise looking to satisfy specific requirements. As part of a larger lesson taken from the growth of Dynamo, Matt emphasized that the key to the emerging role of technology in practice and the AEC industry as a whole is less about learning specific tools and more about codifying a way of thinking — tools are only the implementation of a greater plan.

DATA-DRIVEN DESIGN
Beyond Exchanging Data: Scaling the Design Process
Owen Derby - Flux.io

Flux has been a frequent topic of conversation lately. The company initially marketed a product called Flux Metro, which boasted the potential for collecting the construction limitations of any property based on zoning, building code, municipal restrictions, and property records — an ideal tool for developers and architects to assess feasibility or use as a starting point for the design process.

The company has since pivoted to focus on creating a pipeline for migrating and hosting large quantities of data across many software formats. Their new product line features an array of plugins for transferring data between Excel, Grasshopper, and Dynamo, with plans to release additional tools to connect to AutoCAD, SketchUp, Revit, 3ds Max, and more in the near future. Data exported from these programs is hosted in a cloud repository where it can be archived and organized for design iterations and option investigation. Flux has great potential for achieving seamless interoperability of data and geometry between software platforms and significantly improving the efficiency of the AEC design and production process.

Holly Whyte Meets Big Data: The Quantified Community as Computational Urban Design
Constantine Kontokosta - NYU Center for Urban Science + Progress (CUSP)

The NYU Center for Urban Science + Progress (CUSP) is using research to learn more about the way that cities function. Buildings, parks, and urban plans are all experiments built on assumptions whose true results don’t emerge until years and decades later. How do you measure the “pulse” of a city? How do macro observables arise from micro behavior? Constantine and CUSP have set out to test these questions by collecting and analyzing data from NYC public wireless internet access points, Citi Bike share, 311 complaint reporting, biometric fitness devices, and social media. They use these urban data sources to make better decisions and form initiatives for future community improvement projects. The results also have positive implications for city planning, city operations, and resilience preparation.

Data-Driven Design and the Mainstream
Nathan Miller - Proving Ground

Nathan Miller is the founder of Proving Ground, a technology consultancy for Architecture, Engineering, Construction, and Ownership companies. In his experience providing training and technological solutions, he stresses the importance of equipping staff with the right tools and knowledge to adequately approach projects. There is an intersection between the managers and leaders responsible for projects and staffing, and those who are actually doing the work. It is imperative to focus on outcomes and not get deterred by the process.

The Biggest IoT Opportunity In Buildings Is Closer Than You Think
Josh Wentz - Lucid

Energy consumption, mechanical systems data, thermal retention, and other metrics are not recorded for the majority of buildings worldwide. These are incredible missed opportunities for evaluating the overall performance of a building and collecting real-time research that can inform better construction techniques. Lucid has developed a product called BuildingOS that offers 170 hardware integration options to collect robust building data. This data has the potential to help facilities management departments better track the efficiency and maintenance of their systems, in addition to contributing to the international pool of data that helps us better understand how materials and systems perform over time.

RESEARCH & DEVELOPMENT IN AEC SESSION 2:
Capturing Building Data - From 3D Scanning to Performance Prediction
Dan Reynolds and Justin Nardone - CORE studio / Thornton Tomasetti

This presentation highlighted CORE Studio's use of various technologies for capturing existing conditions data and testing architectural responses through computation. They have utilized drones to capture the condition of damaged buildings and structures by assembling fly-by photos into a point cloud. The development of in-house GPS sensor technology accurate to 1 centimeter anywhere in the world has enabled measuring the built environment and construction assemblies to a high level of precision. CORE Studio has also investigated the use of machine learning for exploring all possible combinations of building design parameters and calculating embodied energy predictions. All of these design technology advancements are helping Thornton Tomasetti design more accurate, better-informed systems.

Data-Driven Design
Luc Wilson - Kohn Pedersen Fox Associates PC

Luc Wilson thinks of data as an “Urban MRI” - a diagnostic tool for measuring the existing configuration of cities and predicting future growth. Multiple FAR and urban density studies were presented that exhibited how comparison to precedents and benchmarks helps to conceptualize the data and make visual sense of the analysis. The key to prediction is the ability to test thousands of designs quickly, which Luc has perfected by developing digital tools for rapidly computing all possible combinations of input parameters and producing measurable outcomes for comparison. One of the most exciting portions of the presentation was the mention of a 3D urban analysis tool called Urbane, which Kohn Pedersen Fox is working with NYU to develop — could this be the new replacement for Flux Metro?

Cellular Fabrication of Building Scale Assemblies Using Freeform Additive Manufacturing
Platt Boyd - Branch Technology

Platt Boyd founded Branch Technology after realizing the potential for 3D printing at a large scale by imitating structures found in nature. Branch has dubbed their technique “cellular fabrication,” in which economical material is extruded with geometric complexity to construct wall panels that are lightweight and easy to transport. Their process seeks to reduce the material thickness required by traditional 3D printing technologies, and the intricate geometric structure provides strength equivalent to that of a printed solid. The wall panels are printed free-form with a robotic arm on a linear track and then installed onsite, where insulation, sheathing, and finish material are added to reflect the same condition as traditional wood or metal stud construction. It will be really interesting to see Branch continue to refine their methods and start to tackle complex wall conditions for use in real-life building projects in the near future.


Watch videos of all the presentations HERE.