DynamoDC - Precision Inquisition

Last week I had the pleasure of presenting [remotely] to the DynamoDC user group. Timon Hazell reached out to ask if I would be willing, and since he had so generously served as our guest lecturer for the Dynamo-litia September 2016 meeting, I was more than happy to return the favor. Apparently DynamoDC has previously hosted several intro to Dynamo workshops, so he asked if I could demonstrate a more advanced example of how Dynamo can be used to extend Revit functionality.

One of my favorite things about AEC community outreach through user groups, conferences, and hackathons is that I am exposed to really interesting problems that I would not have encountered in my own work. At the beyondAEC Hackathon the week before, one of the participants had asked me if there was a way to use Dynamo to isolate the exterior facade material areas and types specifically corresponding to a room in the building. I figured DynamoDC would be the perfect opportunity to tackle this workflow, and the PRECISION INQUISITION: Advanced Extraction of Revit Model Information Using Dynamo presentation was born...

Image 1_DynamoDC_precision inquisition.jpg

Revit models are powerful repositories of geometric, numeric, and descriptive building information; however, the default tools for accessing that information are often limited and cumbersome. Recent questions have been raised about utilizing Dynamo to execute precise tasks such as performing quantity takeoffs on specific portions of an exterior facade, comparing vision glass to room area, or even evaluating the proportion of total facade area by building orientation. Special guest Kyle Martin will deliver a [remote] live demonstration of advanced model analysis approaches with Dynamo. Topics covered will include: visual programming principles, general logic, list management, list at level, filtering and sorting, index tracking, querying Revit parameters, geometric properties, color for clarity, and much more.

In the hour of available presentation time I hoped to cover the following ambitious list of concepts:

  • basic visual programming principles
  • general logic
  • list management
  • list at level
  • filtering and sorting
  • index tracking
  • querying Revit parameters
  • geometric properties
  • color for clarity

As I prepared for the presentation, the Dynamo workflow grew increasingly complex.

The principal function of the Dynamo definition was to query exterior wall geometry from the model based on a room number and perform material takeoffs, directional composition, and visual analysis.


OBJECTIVES: Target specific rooms in the Revit model by Room Number, isolate the exterior Wall/Window elements specific to that room, calculate total area of exterior facade for each Room, understand composition of vision to solid materials, and assist with code calculations such as light & ventilation.
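
The last objective above can be sketched as a small code-calculation helper. This is plain Python rather than a Dynamo graph, assuming the room and vision-glass areas have already been extracted from Revit; the 8% glazing-to-floor threshold is an illustrative placeholder, not a citation of any specific code.

```python
# Hypothetical sketch of a light & ventilation check. Inputs are assumed to
# come from Dynamo's Revit parameter queries; the 8% minimum is illustrative.

def light_ventilation_check(room_area, vision_glass_area, min_ratio=0.08):
    """Return the glazing-to-floor ratio and whether it meets the threshold."""
    ratio = vision_glass_area / room_area
    return ratio, ratio >= min_ratio

# Example: a 200 sq ft room with 18 sq ft of vision glass
ratio, passes = light_ventilation_check(200.0, 18.0)
print(f"glazing ratio: {ratio:.1%}, passes: {passes}")  # glazing ratio: 9.0%, passes: True
```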

OBJECTIVES: Query all exterior Wall elements, use the underlying geometry to determine direction of each wall, sort walls by cardinal directions or bespoke orientation system, and calculate proportion of facade areas for each direction.

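
The direction-sorting step can be sketched in plain Python. This stands in for the Dynamo nodes that would extract each wall's exterior-facing vector; the four-bucket cardinal scheme is just one possible orientation system.

```python
import math

def cardinal_direction(x, y):
    """Bucket a wall's exterior-facing vector (x, y) into N/E/S/W.
    Angle is measured counterclockwise from east (+X)."""
    angle = math.degrees(math.atan2(y, x)) % 360
    if 45 <= angle < 135:
        return "N"
    if 135 <= angle < 225:
        return "W"
    if 225 <= angle < 315:
        return "S"
    return "E"

def facade_proportions(walls):
    """walls: list of (facing_x, facing_y, area) tuples.
    Returns {direction: fraction of total facade area}."""
    totals = {}
    for x, y, area in walls:
        d = cardinal_direction(x, y)
        totals[d] = totals.get(d, 0.0) + area
    grand = sum(totals.values())
    return {d: a / grand for d, a in totals.items()}

# Example: a simple rectangular building, long sides facing north and south
walls = [(0, 1, 120.0), (1, 0, 80.0), (0, -1, 120.0), (-1, 0, 80.0)]
print(facade_proportions(walls))  # N and S are 30% each, E and W 20% each
```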

OBJECTIVE: Color specific items for analysis, visual clarity, and storytelling


The live demonstration was recorded for your viewing pleasure. You may notice the video starts a little late due to technical difficulties but no significant content was missed. Presentation slides and the Dynamo file can be accessed HERE.

Even with a rush towards the end, I was able to successfully make it through all the content. I appreciate the audience being super receptive and patient given the hands-off format. And a very special thank you to Timon Hazell, John Schippers, and Dana De Fillippi for the opportunity.

DynamoDC audience


Kyle's "home studio" setup


The other day I received a surprise t-shirt in the mail as a thank-you. I will wear it with pride!

Image 5_DynamoDC t-shirt.jpg

Journey to the West Coast - ACBD2017 & SFDUG

Last week I was in San Francisco for the inaugural Advancing Computational Building Design conference featuring two days of speakers, panels, and discussions centered around the growing importance of technology in architecture, engineering, and construction. The conference placed a particular emphasis on computational design -- the use of coding, visual programming, data analytics, and other methods for more informed design and implementation.

I had the pleasure of speaking on a panel titled "Reimagining the Culture & Contractual Relationships Between Owners, Architects, & Contractors to Enable Further Adoption of Computational Design" alongside Aubrey Tucker, Innovative Technology Developer at Stantec, and Thomas Whisker, VDC Project Manager at Turner Construction. While my colleagues focused on the intricacies of contracts, BIM disclaimers, and model fidelity, I took the opportunity to share how Tocci Building Co. is approaching projects differently. As Owner's Project Manager on several projects, we have the unique authority to bring the Virtual Design and Construction (VDC) team in earlier on the project for design facilitation and a simultaneous quantity takeoff and pricing process. At Tocci, computational design further enhances what we are able to do.

Panel from left to right: Kyle Martin, Aubrey Tucker, Thomas Whisker. Moderator: Jorge Barrero. Photo credit: Ryan Cameron.


With more traditional project delivery, VDC implements rigorous MEP/FP coordination, resulting in confirming RFIs with near-instantaneous response times and less tedious paperwork. A major advantage of a collaborative approach to project delivery/coordination is that potential conflicts are identified earlier in the project. Resolving issues before they become time sensitive, at a point when the owner can still make decisions about the cost of the project -- rather than have those decisions made for them by the construction schedule -- will ultimately deliver a final product much closer to the initial design intent.

The key takeaway from ACBD for me was the overwhelming consensus that technology is crucial to the future success of the AEC industry. I attend several conferences per year and never have I encountered a more unified, passionate crowd of interdisciplinary professionals who are consistently pushing the boundaries of possibility and setting an example of true innovation through their work.


SFDUG_8-2017_Cost inTranslation.jpg

Later that evening I delivered one of two featured presentations to the San Francisco Dynamo User Group celebrating their 2-year Anniversary at AIA San Francisco. The SFDUG was formed with the same purpose as the Dynamo-litia Boston User Group I founded at the Boston Society of Architects to help educate and promote the use of Dynamo visual programming in local AEC communities. My presentation "Cost in Translation: Bridging the Gap Between Designers & Contractors" highlighted some of my efforts at Tocci including several day-to-day implementations of Dynamo and longer term projects such as the Sasaki WinterLight Pavilion and Union Point Comfort Station. The crowd at this event was super receptive and it was fun to meet so many new faces from the opposite side of the country!

Image credit: Ryan Cameron


Unfortunately there was an error recording my portion of the presentation, however you can view the slide deck HERE.

Make sure to check out the other presentation Computational Design Increases Value to Project Managers & Designers | How to gain Buy-In by colleague Ryan Cameron, Architect at DLR Group.

In the end it was a busy couple of days on the West Coast but I thoroughly enjoyed the opportunity to exchange knowledge and promote the importance of computational design and data-driven processes in the future of the AEC industry.

Hands-on Prototyping for BUILDing Forward

Read about the unique opportunity for geometry analysis, fabrication, and the resulting gallery installation as initially reported on the Tocci Blog...

Image Credit: Jamie Farrell


On July 27th, an opening reception was held for Autodesk’s BUILDing Forward exhibit at the Boston Society of Architects. This exhibit celebrates digital craft in the greater Boston community and highlights the research projects made possible by the Autodesk BUILDSpace — a state-of-the-art research and development facility in the Design Center.

Tocci partnered with Sasaki Associates to research and develop a prototype called WinterLight, a proposal for a temporary winter pavilion for the Rose Kennedy Greenway. Currently in the early design phase, WinterLight is a warming hut designed to encourage activation of the city’s public realm during the winter months. The structure is a semi-dome with strategic openings in customized masonry blocks, designed to shield visitors from winter winds while they enjoy the warmth of an interior fire pit. The pavilion will be located in Boston; the exact site is to be determined.

Image Credit: Lucca Townsend, Sasaki Associates


This project required extensive computational design from Sasaki staff to strike a balance between desired aesthetic and regularity of the blocks. Tocci’s role was to assist with geometry analysis and support the design process through construction feasibility studies. With each new design iteration, we utilized Dynamo Studio to extract total pavilion dimensions, overall block quantities, block sizes, repeatable types, total volume, total weight, and other metrics.

Image Credit: Lucca Townsend, Sasaki Associates


image 2b - color analysis & block types.jpg

The BUILDing Forward exhibit provided the perfect opportunity to experiment with fabrication methods and materials for producing the unique geometry of the blocks. Sasaki chose a section of nine blocks, composed of five unique types, from the overall pavilion to demonstrate scale and geometric variation. They first generated a digital model of the composition, then processed the individual shapes into toolpaths for cutting profiles from Medium Density Fiberboard (MDF) using a three-axis Computer Numeric Control (CNC) router. They then cut an ingenious system of holes into the MDF sheets, lining up each piece using threaded rod. This created a negative form of each block shape for pouring concrete. Each concrete form also incorporated removable sections and a hole at the top for pouring. Finally, they sanded and coated the interior surfaces of the forms with an epoxy sealer to facilitate the release of the concrete.

image 3 - molds disassembled.jpg

To prep for pouring, we disassembled the forms to coat the interior surfaces with form release. They were then reassembled on the threaded rod guides and tightened using nuts and washers.

image 4 - molds coated.jpg

We blended Portland cement and silica sand to create a concrete mixture that could support the compressive load of the stacked blocks and maintain a smooth, gallery-quality finish. As each five-gallon bucketful of concrete was poured through the top, a team tapped the sides of the forms, agitating the mixture and forcing trapped air bubbles to the surface.

Image Credit: Christine Dunn, Sasaki Associates


At times, the form release did not work properly, forcing us to pry the blocks from their forms.

image 6 - block releasing.jpg

Rotating shifts of the Sasaki-Tocci team spent a week producing the prototype, as each block required 24 hours to cure. With one last round of chiseling and sanding, all nine blocks were ready for their BSA Space debut.

image 7 - blocks stacked.jpg

The opening reception was well attended. It was inspiring to see so many creative projects coming out of the BUILDSpace and local AEC community. BUILDing Forward will be on display at the BSA until October 5th, 2017 if you would like to see our work and all the other excellent projects.

Stay tuned for more as the WinterLight project evolves into a full-scale realization.

Check out this Sasaki blog post about the BUILDing Forward event for even more information.

Facades+ Boston - Visual Programming with Dynamo Workshop

Originally featured on the Tocci Blog, I recap my recent experience co-leading a Dynamo workshop at this year's Facades+ one-day conference in Boston...

Last week I attended and presented at this year’s Facades+ Boston event — a one-day symposium and trade show focused on the importance of high performance envelope design in the AEC Industry.

SYMPOSIUM:

The first half of the day featured three engaging panel discussions.

Panel 1 – Expanding the Envelope: Generating Urban Data for Responsive Design:
This group of panelists urged the importance of data, tech innovation, and digital equity in the Boston built environment. Capturing data for many city metrics helps reveal trends and provide insight for a prosperous and safer tomorrow.

Panel 2 – Modernist Performance Retrofits:
One presentation on architectural detailing for a historic retrofit project provided an intriguing contrast with a second about examining materials and fenestration details to identify high-performing wall assemblies at different price points. While one project carefully considered the aesthetic ramifications of its intervention, the other team thoroughly emphasized performance.

Panel 3 – Making Space for Bostonians:

Place-making is an essential consideration in urban design and these three panelists discussed the role that strategic programming, structural and material innovation, and inviting public space has played in creating thriving districts in the city of Boston.

VISUAL PROGRAMMING WITH DYNAMO WORKSHOP:

Colin McCrone and I led an afternoon workshop that demonstrated the usefulness of visual programming for Revit in facade design and analysis workflows. Our workshop kicked off with an introduction to the concepts of computational design, migration across various software platforms, and examples of how the tools are being used in the industry today.

After providing an overview of the interface, terminology, functions, and features, the first exercise tackled one of Revit’s most temperamental elements – curtain wall. Modifying curtain wall requires many sequential clicks to adjust overall size, mullion spacing, and exchanging pinned panels, mullions, and doors. Dynamo provides the capability to query information from the model, target specific items, and batch alter them as needed. The accompaniment of math and logic adds further analysis and opportunities for customization to the process.
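
The query-target-batch-alter pattern described above can be sketched outside Revit with a mock element class. In an actual Dynamo graph the equivalent would be filtering nodes feeding Element.SetParameterByName; the Panel class below is purely hypothetical.

```python
# Plain-Python stand-in for the batch-edit pattern: filter elements by a
# predicate, then set a parameter on every match in one pass.

class Panel:
    """Hypothetical mock of a curtain wall panel element."""
    def __init__(self, name, params):
        self.name = name
        self.params = dict(params)

def batch_set(elements, predicate, param, value):
    """Set `param` to `value` on every element matching `predicate`; return count."""
    hits = [e for e in elements if predicate(e)]
    for e in hits:
        e.params[param] = value
    return len(hits)

panels = [
    Panel("P1", {"Type": "Glazed", "Pinned": True}),
    Panel("P2", {"Type": "Spandrel", "Pinned": True}),
    Panel("P3", {"Type": "Glazed", "Pinned": True}),
]
# Unpin every glazed panel in one pass instead of clicking each one
n = batch_set(panels, lambda p: p.params["Type"] == "Glazed", "Pinned", False)
print(f"updated {n} panels")  # updated 2 panels
```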

TBC blog_KM Facades+_1.jpg

For the final portion of the workshop, we highlighted three panelization processes that demonstrate the geometric design potential of Dynamo. The first used pixel brightness from an image to swap out panels by color and generate a mosaic interpretation.
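
The brightness-to-panel mapping behind this mosaic effect might look something like the following sketch; the bucket boundaries and panel type names are illustrative, not the actual workshop dataset.

```python
# Map a sampled pixel's brightness (0.0 dark .. 1.0 bright) to a panel type
# by evenly splitting the range into as many buckets as there are types.

def panel_type(brightness, types=("Dark", "Medium", "Light")):
    """brightness in [0, 1]; evenly split into len(types) buckets."""
    idx = min(int(brightness * len(types)), len(types) - 1)
    return types[idx]

# One sampled row of an image
row = [0.05, 0.40, 0.62, 0.95]
print([panel_type(b) for b in row])  # ['Dark', 'Medium', 'Medium', 'Light']
```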

The second read point coordinate data from an excel spreadsheet to place 4-point adaptive curtain wall panels in a curvilinear wall configuration.

TBC blog_KM Facades+_3.jpg

Lastly, the third utilized the Revit Sun Path tool to analyze solar gain on each panel of a wall surface and colorize the panels from least to most exposure.

TBC blog_KM Facades+_4.jpg

Overall the experience was a huge success and we both thoroughly enjoyed sharing our knowledge with the 20 or so members of the Boston AEC community who attended our workshop.

If you would like to learn more about Dynamo, the June gathering of the Dynamo-litia Boston user group will feature a brief recap of the Facades+ workshop and more in-depth presentations of how Dynamo is being used in practice.

More about Facades+ Boston

AEC Technology Hackathon 2016


Last month I had the pleasure of attending the fourth annual AEC Technology Symposium and Hackathon put on by Thornton Tomasetti's CORE Studio in New York City. The symposium kicked off with many fantastic speakers; I highly recommend checking out the full videos of the presentations over on the TT CORE Studio Youtube playlist. As with last year's symposium, I was personally most impressed with the work presented by Luc Wilson and Mondrian Hsieh demonstrating the use of computational design and custom digital tools for urban planning and visual analysis with Kohn Pedersen Fox's Urban Interface.

This year was also my first ever participation in a hackathon. I registered with the goal of teaming up with technology enthusiasts and individuals from other disciplines to see if I could help develop a solution for some of the pain points frequently encountered during the design and documentation process. My hope was to leverage my Dynamo knowledge and my experience uncovering barriers in architectural practice, and to learn something about coding bespoke applications and user interfaces from those more familiar with the software side of the industry.

Image courtesy of Thornton Tomasetti CORE Studio


The goals of the hackathon were simple...
This event is organized for programmers, web developers, and AEC experts to collaborate on novel ideas and processes for the AEC industry. The focus will be on digital/computational technologies that have been used on projects, the lessons learned from them, and how it impacted the overall project workflows. The Hackathon aims for attendees to learn new skills, generate new ideas and processes for the AEC community through data-driven design and customized applications.

Everyone had approximately 24 hours to assemble teams, formulate an idea, and get to work trying to create a prototype. After the 24 hours, each team was to report out on what they had created and a panel of judges would determine the winners. For the first hour, individuals from the group of 60 or so hackers had a chance to pitch their ideas and attempt to attract a team. Following introductions, everyone mingled and quickly decided which topics they found most interesting and figured out what skill sets were required to fulfill the goals of the project.

The team I joined forces with was drawn to an idea originally proposed by Timon Hazell:
When continuously exchanging Revit models among constituents on a building project, it is a time-consuming process to track down what changed between versions. In an ideal world, the architects, engineers, or consultants who are sending the updated model will write a summary or list the changes but this rarely actually occurs. Therefore, the traditional approach typically involves a painstaking process of opening both models simultaneously on separate monitors and spotting differences via visual comparison. Is there a better way to see what has changed between two versions of a Revit model and analyze just how much has changed throughout the project?

We ended up with a fantastic team of diverse perspectives to tackle this problem:
  • Me - represented the architecture side: knowledge of project delivery, recurring challenges, and opportunities for process optimization
  • Timon - represented the engineering side: spends significant time receiving and interpreting design intent from the architect with little documentation of changes
  • Charles - represented the consultant side: an acoustician with decades of architecture experience who also regularly receives design intent from the architect and must interpret it
  • Matt - represented the software side: experience developing custom digital tools and troubleshooting prepackaged software solutions to enhance AEC production

From left to right: Kyle Martin, Matt Mason, Charles Prettyman, Timon Hazell


The first step was to define the problem: what are all the factors that constitute a change in a Revit model? After some brainstorming we identified 4 key change types:

  1. Elements added to the model
  2. Elements deleted from the model
  3. Family type or information parameter changes
  4. Geometry changes: location, shape, or size

We set out to create a two-part solution to this problem. First, a C# Revit add-in that essentially acts as a "diff" to compare all Revit elements between two models and generate a list of viewable items. Second, a JSON file and accompanying Dynamo workflow that would produce a data visualization for targeting concentrations of changes throughout the project.
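
The core "diff" idea can be illustrated with a toy Python version that compares two snapshots of element parameters. The real add-in does this in C# against live Revit elements, so the element IDs and parameter names here are illustrative only.

```python
# Toy model diff: snapshot each model as {element_id: {parameter: value}},
# then classify every element as added, deleted, or changed.

def diff_models(old, new):
    added   = sorted(new.keys() - old.keys())
    deleted = sorted(old.keys() - new.keys())
    changed = sorted(
        eid for eid in old.keys() & new.keys() if old[eid] != new[eid]
    )
    return {"added": added, "deleted": deleted, "changed": changed}

old = {101: {"Type": "Wall A", "Length": 20.0},
       102: {"Type": "Door 36in", "Level": "L1"}}
new = {101: {"Type": "Wall A", "Length": 24.0},   # geometry change
       103: {"Type": "Window F", "Level": "L1"}}  # added; 102 deleted
print(diff_models(old, new))
# {'added': [103], 'deleted': [102], 'changed': [101]}
```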

Our trusty C# guru Matt immediately began coding the Revit add-in while the rest of the team created sample Revit models and cartooned out the data visualization component. After many hours of relentless coding, the first add-in prototype was ready to test. After a few rounds of troubleshooting, we were able to isolate the first list of altered Revit elements and export the first JSON file. The parameters associated with each Revit element contained within the JSON file allowed us to start building a Dynamo definition to restructure and visualize the data using the Mandrill package from Konrad Sobon. By early morning we had a working Revit add-in that mostly accomplished what we were looking for, and we began working out the kinks in the Dynamo workflow. As time began to evaporate in the final hours, we scrambled to test and troubleshoot the tools, assemble our presentation, and develop documentation. Ultimately we decided on the name Metamorphosis to represent the transformation of Revit models over time and their evolution into thoroughly-coordinated built form.

At the end of the Hackathon, approximately a dozen projects were presented to the judges in 5-minute maximum time allotments. Our team tried our best to efficiently explain the initial idea and walk the crowd through how the tools we developed were a viable solution that would be easy to deploy to the average Revit user. After some deliberation, the winners were announced and we were thrilled to find out that we took second place, in large part because of the practicality of the problem we chose and our willingness to share the solution as open source.

And the development didn't stop there...

Following the hackathon, the Revit add-in code was improved to fine-tune some of the desired features. In addition, the Dynamo definition was cleaned of its hackathon-induced spaghetti and properly labeled. Most importantly, everything was updated and organized into a GitHub repository.

INTRODUCING "METAMORPHOSIS" - An Open Source Revit Change Analysis Tool
Running the model comparison add-in results in a list of Revit elements that can be filtered and re-sorted. Clicking on the categories and individual elements adds them to the selection in the active view and zooms to their location.

List of changed elements sorted By Category (left) or By Change Type (right)


Clicking the Color Elements button will apply Override Color in View to all elements that fall under 3 change types:

  1. Green - New Elements
  2. Blue - Geometry Change (size or shape)
  3. Red - All Other Changes: modifications to parameters, location, rotation, etc.
The Color Elements feature works in any view type: plan, RCP, section, 3D axon, etc.


For some of the change types, an Analysis Visualization Framework (AVF) object appears:

  1. A box for an element that has been removed
  2. Arrow(s) for an element that has changed location (in the case of elements like walls, there are two location points so you get two arrows)
  3. A symbol if an element has been rotated

On the Dynamo side, opening the .dyn file and browsing to the exported JSON file will process the accompanying data for visualization in Mandrill. Clicking "Launch Window" in the Report Window node to the far right will open up the interactive data visualization window containing 4 chart types:

  1. Donut Chart (upper left) - number of changes by element type
  2. Stacked Bar Graph (upper right) - number of changes by change type
  3. Bar Graph (lower left) - percentage of items changed vs. total number of items for each category
  4. Parallel Coordinates (lower right) - number of changes for each level, each overlapping line represents a different Revit element category

METAMORPHOSIS VALUE SUMMARY

  • colorize elements in any active view to quickly identify changes, much more efficient than previous methods
  • color by change type allows you to target specific changes
  • sorting, filtering, and element selection in add-in interface allows for quick location and isolation of elements
  • quickly evaluate where most changes are occurring with analytics/visualization, this is particularly useful if the model comes with no documentation
  • compare current state to any previous model, helpful to tell the story of location and amount of changes over time
  • not just a tool for coordinating/viewing changes, but also a reminder to cloud revisions as you go if the drawing set has already been issued


Interested in trying this tool out? Here is where you can access the datasets and learn more:

Github Repository
DevPost page
Presentation slides
Youtube Screen Capture Demonstration


In the end I got exactly what I wanted out of the hackathon experience. I was able to work with three individuals who possess completely different skill sets from my own. I provided the team with background context and an understanding of the problem from an architectural perspective so that we could devise a technological solution. More specifically, Timon and I pushed ourselves to utilize tools that we would not regularly encounter in practice and to capitalize on the opportunity to learn the Mandrill package for Dynamo, JSON data formatting, Revit add-in configurations, and establishing a GitHub workflow for sharing and maintaining associated files.

A huge thank you goes out to the Thornton Tomasetti crew who worked so hard to put on such a well-executed event. Thanks to the judges who volunteered their time to hear all of our frenzied, sleep-deprived 5-minute-plus presentations. Lastly, shout out to my teammates who all worked tirelessly to make our idea a reality!

Dynamo-litia Boston - October 2016


This second Dynamo-litia workshop featured a live demonstration of modifying Revit parameters using Dynamo.

Workshop Description:
Back by popular demand! This session will showcase several practical workflows for everyday Revit production. If you are still wondering how Dynamo applies to the work regularly performed in architecture firms, this is the perfect chance to find out. The majority of the meeting will be devoted to a live demonstration and attendees will be encouraged to follow along. No prior Dynamo experience necessary; users of all levels welcome.

When: October 20, 2016
Where: BSA Space - Boston

More information at the Boston Society of Architects.

Apologies for the abrupt ending. The battery on the recording device died right before resolving and fully explaining the Element.SetParameter function but this video contains 99% of the relevant content. Presentation slides and datasets can be downloaded HERE.

Ceiling Alignment Made Easy(ier) with Dynamo


Rectangular ACT ceiling grid alignment is a task that frequently occurs during the design and documentation of architecture projects in Revit. After a ceiling has been placed, the typical approach involves creating a dimension string between two parallel walls and one of the gridlines in the ACT ceiling, selecting the dimension, and clicking the EQ symbol that automatically centers that gridline between the surrounding walls. This process must then be repeated for the perpendicular orientation.

For a more efficient workflow using Dynamo:

  • create a new Generic Model family
  • in the family editor, go to the Manage tab > Object Styles
  • in the Object Styles menu under Model Objects, click New under Modify Subcategories
  • name the new object DYNAMO and change the color to something bright that will be easily identified in the RCPs
  • in elevation, create a new reference plane and connect a dimension string between this and the Ref. Level
  • in plan, draw a rectangle the same size as one ACT tile (1’x1’, 2’x4’, etc.)
  • also draw a line at the midpoint of each direction to determine the center point of one tile
  • make sure that the lines are assigned to the upper reference plane so that they will be positioned near to the ceiling
  • in the Family Types dialog, create a new parameter for Offset Height — this will be assigned to the dimension string in elevation and will determine the offset distance from the floor to the ceiling. It can be a Type parameter (same for EVERY instance of the family) or an Instance parameter, which would place at a default height and then allow adjustments for ceiling height variation in the project.

Not all projects are perfectly orthogonal. In some cases, there may be a defined angular shift in portions of the building, if not many unique angles. A secondary group of lines could be copied and pasted to the same place then assigned On/Off visibility parameters for an orthogonal and angled variation of the family. An instance parameter for the angle would allow for a custom rotation of up to 90 degrees on every instance in the project.

Once your family is completed, save and load into the Revit model. A Dynamo definition can then be built that targets ceiling elements in the model, queries their center point, and places the alignment family.
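
The placement logic might be sketched as follows. This plain-Python stand-in mocks ceilings as bounding-box tuples, whereas the actual definition would use nodes along the lines of Element.BoundingBox and FamilyInstance.ByPoint.

```python
# Sketch of the placement step: for each rectangular ceiling on the target
# level, take the center of its bounding box as the point where the
# alignment family would be instantiated. Ceiling data is mocked as
# (min_x, min_y, max_x, max_y, level) tuples.

def placement_points(ceilings, target_level):
    """Return (x, y) centers for ceilings on the requested level only."""
    points = []
    for min_x, min_y, max_x, max_y, level in ceilings:
        if level == target_level:
            points.append(((min_x + max_x) / 2, (min_y + max_y) / 2))
    return points

ceilings = [
    (0, 0, 10, 8, "Level 2"),
    (12, 0, 20, 8, "Level 2"),
    (0, 0, 10, 8, "Level 3"),  # filtered out: wrong level
]
print(placement_points(ceilings, "Level 2"))  # [(5.0, 4.0), (16.0, 4.0)]
```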

Ceiling Alignment - Dynamo Definition (hi-res image available HERE)


I chose to specifically isolate ceilings by Type and also by Level. This helps cut down on the computational power and time required for the task to run. Another advantage is being able to open an RCP view, watch the families instantiate, and verify that everything has been configured correctly in Dynamo.

After the alignment families have been placed, users on the team can begin the task of manually aligning the ceiling grids to the red box. If it is determined that there is not sufficient space between the perimeter walls and their nearest gridline, the centerline crosshair of the family can be used to perfectly center the ceiling grid in the room instead.

Because the lines in the alignment family have been created with the name DYNAMO under the Generic Models category, it is easy to turn off their visibility throughout the project for final documentation via the View Template. Additionally, if ceilings have shifted and the positioning of the families becomes obsolete over time, it is easy to select one instance, right-click to select all in the view or project, then delete them entirely.

Given a minimum of 5 mouse clicks for the traditional process, selecting the Align tool (or typing the AL hotkey), picking a line on the alignment family, and then a gridline requires fewer clicks. Multiplied over dozens or even hundreds of ceilings throughout the project, this approach is vastly more efficient and removes the need for extra decision making.

One must still manually account for the minimum distance between the perimeter walls and the grid, and this approach is obviously not ideal for L-shaped and irregular ceiling profiles. However, the majority of ceilings in most projects are rectangles, and Dynamo can help production staff work through an entire RCP full of ceilings quickly so they can apply their time to other pressing matters.

Dynamo-litia Boston Turns 1!

This week Dynamo-litia Boston celebrated its one-year anniversary. To mark the occasion, I used Dynamo to generate a virtual birthday cake.

Here are some of the highlights of the first year:

First session: September 21, 2015

7 Presentations:

  1. Introduction to Dynamo
  2. Dynamo for Production
  3. Dynamo and the Evolution of BIM
  4. Dynamo for All
  5. Dynamo and the Zen of Data Flow
  6. Work Smarter Not Harder
  7. Bringing Engineers & Architects Together Through Digital Design

1 Workshop:
Revit parameter export
Panelized surface & analysis


Did you know there is an entire Vimeo album devoted to the Dynamo-litia?

Dynamo-litia Boston Album

6 videos
1,777 Plays
51 Finishes
Average time per view: 34m 06s

Top 10 Countries:
US, UK, Spain, Canada, Brazil, Australia, Netherlands, Italy, Singapore, Germany

Lastly, this year would not have been possible without the contributions of many. Special thanks to:

Boston Society of Architects:
Conor MacDonald
Sara Garber
Revit User Group

Autodesk - Dynamo Team

Shepley Bulfinch

Speakers:
Zach Kron - Autodesk
Kevin Tracy - NBBJ
Christina Tully – Shepley Bulfinch
Masha Pekurovsky – Perkins Eastman
Eric Rudisaile - Microdesk
Timon Hazell - Silman

Most importantly, the Boston AEC Community! Looking forward to future sharing and collaboration.

Dynamo-litia Boston - September 2016

This installment of Dynamo-litia featured Timon Hazell, Sr. BIM Engineer at Silman (Washington DC).

Bringing Engineers and Architects Together Through Digital Design
Design changes that took weeks to coordinate are now happening in hours. We are now able to create new iterations of complex designs in seconds. This speed has its benefits, but it also adds complexity to current collaboration practices. How can we work better as a single design team? How can we use conceptual abstract models to generate documentation models? How can we model non-planar framing directly in Revit? You know the answer to many of these involves Dynamo! Join us as Timon Hazell from Silman shares his experiences and talks through a few case studies using Revit, Rhino, Dynamo and Grasshopper.

When: September 22, 2016
Where: Shepley Bulfinch - Boston

More information at the Boston Society of Architects.

Due to A/V difficulties, a few portions of the presentation did not make the video. To follow along AND see upcoming announcements, make sure to download the presentation slides HERE.

Automated Room Placement From Existing Drawings

Sometimes the only resources for existing conditions on a project are scans of original architectural drawings, often produced decades earlier. Scanning the drawings converts them into digital form, but these are flattened images from which no smart information can be extracted. Tracing walls, stair locations, and other building elements on top of a linked image underlay is relatively easy in Revit. Rooms, however, present a more difficult challenge because they are represented only by a text label, and many rooms may therefore occupy the same open space. For example, a corridor may contain several appendages or alcoves that have no physical elements separating them. The task becomes much more time consuming when placing a few hundred rooms across several levels of an existing building, which is what I was recently asked to do.

I began this investigation by converting the PDF scans into JPG files and opening them in Photoshop, where I could quickly isolate just the room names. Once they are isolated, use a combination of the Gaussian blur tool, inverse select, and black color fill to convert the room name locations into larger black blobs, then export each floor as a new JPG.

In Dynamo, import the JPG containing the black blobs and scan the image for black pixels using the Image.Pixels and Color.Brightness nodes. You may have to experiment with the pixel values: a large number in the xSamples input will generate too large a pixel array and take a long time to run, while too small a number may cause the node to miss some of the black blobs.
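As a rough illustration of what those nodes are doing, here is a pure-Python sketch of sampling a grid of pixels and keeping the dark ones. The synthetic 8x8 "image" and the 0.5 threshold are assumptions for the sketch, not values from the actual definition:

```python
# Minimal analog of the Image.Pixels / Color.Brightness step: sample a grid
# of pixels, compute a 0-1 brightness for each, keep only the dark samples.

def brightness(rgb):
    """Brightness in 0..1 (simple channel average; Dynamo may weight channels)."""
    return sum(rgb) / (3 * 255.0)

WHITE, BLACK = (255, 255, 255), (0, 0, 0)
image = [[WHITE] * 8 for _ in range(8)]  # stand-in for the exported JPG
image[2][4] = BLACK                      # a "text blob" pixel
image[6][6] = (30, 30, 30)               # a nearly black pixel

x_samples = 4                  # coarser sampling = faster, but may miss blobs
step = len(image) // x_samples
dark_points = []
for row in range(0, len(image), step):
    for col in range(0, len(image[0]), step):
        if brightness(image[row][col]) < 0.5:  # keep only dark samples
            dark_points.append((col, row))

print(dark_points)  # [(4, 2), (6, 6)]
```

Note that the coarse sampling grid only catches blobs that happen to land on sampled pixels, which is exactly why the xSamples value needs tuning.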

Rooms from Existing - Dynamo Definition (hi-res image available HERE)

The Color.Brightness node returns a list of values between 0 and 1 corresponding to the brightness of each pixel. Using some list management techniques, the list is inverted, all white pixels (now 0s) are filtered out, and only the darkest (largest) values are used to isolate the points where text appears on the existing plan. Circles are created at all of the remaining points, with the radius defined by the corresponding darkness values. The entire list of circles should be matched against itself to group all intersecting circles, because the Color.Brightness node may have read multiple black pixels within each text blob. Then extrude all the circles as solids, intersect any joining geometry, and use the Solid.Centroid node to determine the center point of each solid, which in theory should be the location of each text label on the existing plans. A Count node can be used to evaluate the resulting number of text location points and determine whether the total count of black blobs read in Dynamo closely matches the total number of room names labeled in the PDF scan.
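The grouping-and-centroid idea can be sketched like this in plain Python (the circle coordinates are invented; the real definition works on Dynamo solids instead):

```python
# Sketch of "group intersecting circles, then take centroids". Circles are
# (x, y, radius); two circles merge when the distance between centers is
# less than the sum of their radii.
import math

def intersects(c1, c2):
    (x1, y1, r1), (x2, y2, r2) = c1, c2
    return math.hypot(x2 - x1, y2 - y1) < r1 + r2

def group_circles(circles):
    """Greedy clustering: each circle joins the first group it intersects."""
    groups = []
    for c in circles:
        for g in groups:
            if any(intersects(c, other) for other in g):
                g.append(c)
                break
        else:
            groups.append([c])
    return groups

def centroid(group):
    xs = [c[0] for c in group]
    ys = [c[1] for c in group]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

circles = [(0, 0, 1.0), (0.5, 0.5, 1.0), (10, 10, 1.0)]  # two overlap, one alone
points = [centroid(g) for g in group_circles(circles)]
print(points)  # [(0.25, 0.25), (10.0, 10.0)] -- one point per text blob
```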

If the counts are significantly different, adjust the pixel value sliders at the beginning of the definition to create a more or less dense field of points and re-trace the process up to this point. If the counts are close, the next step is to query the traced existing-conditions walls from the Revit model for the corresponding level. This can be done using a combination of GetAllElementsOfCategory with the Walls category and then filtering only the elements on the same level.

Once the walls appear in Dynamo, you will most likely see that the points derived from the existing plans do not match the scale, orientation, and location of the Revit walls. Placing a Geometry.Scale and Geometry.Rotate node between the list of solids and the Solid.Centroid node allows you to experiment with various scale and rotation values prior to the creation of the final points. After some trial and error and visual approximation, you should be able to scale and orient the cluster of points to match the scale of the Revit model -- each point should look like it lands at the center of a room.

Even after scaling and rotating the points, they may still be located off to the side of the Revit walls. To coordinate locations between Revit and Dynamo, begin by going to Manage > Setup > Line Style in Revit and create a new line called Dynamo. In the floor plan view of the level you are working on, pick a prominent element (such as a corner of the building) and draw a model line and change it to the newly created Dynamo line style. Back in Dynamo, use the Select Model Lines by Style node from the Archi-Lab package to locate the "Dynamo" line in Revit. Get the location at the start of that line using the Curve.Start node and also the location of origin in Dynamo with the Point.Origin node. A vector can now be established between these two points and used to translate all of the room locations scanned from the JPG to the same location as Revit. Note that moving from the Dynamo origin to the line drawn at the Level in plan should also move the room points to the correct elevation.
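The translation step amounts to simple vector arithmetic, sketched here with a made-up line start point standing in for the Curve.Start result:

```python
# Build a vector from the Dynamo origin to the start of the reference
# "Dynamo" line, then shift every scanned room point by that vector.

def translate(points, vector):
    return [tuple(p + v for p, v in zip(pt, vector)) for pt in points]

origin = (0.0, 0.0, 0.0)          # Point.Origin in Dynamo
line_start = (120.0, 45.0, 10.0)  # hypothetical Curve.Start of the reference line
vector = tuple(b - a for a, b in zip(origin, line_start))

room_points = [(1.0, 2.0, 0.0), (3.5, 4.0, 0.0)]
print(translate(room_points, vector))  # [(121.0, 47.0, 10.0), (123.5, 49.0, 10.0)]
```

Because the line was drawn at the level's elevation, the Z component of the vector also moves the room points to the correct height, as noted above.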

Once everything appears to line up correctly, the Tool.CreateRoomAtPointAndLevel node from the Steam Nodes package will place rooms at each point in the Revit model. For open areas such as corridors, multiple rooms will now be overlapping, potentially causing Warnings. The last step is to go through the model and draw room separation lines at logical points where divisions should occur between the room elements.

After every room has been separated in Revit, the process of populating Room Names, Numbers and other parameters is a manual one. However, if you are lucky enough to possess a CAD file for the existing conditions, Dynamo can also be used to populate the newly-created Room elements with text parameters. Begin by importing the CAD file into the Revit model and move it to the correct location. As a best practice, I tend to delete and purge all unnecessary layers in the CAD file prior to import -- in this case you may save a one-off of room text only and import that instead. In Revit if you explode an imported CAD drawing, the text will become actual Revit text objects (learn more HERE). With Dynamo, all text objects can be grouped into clusters based on shared location, matched up with the Room element center points, and then populate the associated elements with parameter information: Name, Number, Space Type, etc. Due to incongruent alignment or the use of leader lines, this workflow will most likely not work for every text item from the CAD plan but it may alleviate a large portion of the manual data entry required to populate the Revit room elements. More about this process in a future post...
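One way to sketch the text-to-room matching, assuming the exploded text objects and room centers have already been reduced to 2D points (all names and coordinates below are invented):

```python
# Pair each exploded CAD text object with the nearest room location point;
# the room would then take its Name from that text.
import math

def nearest(point, candidates):
    return min(candidates, key=lambda c: math.hypot(c[0] - point[0], c[1] - point[1]))

rooms = {(10.0, 10.0): None, (30.0, 12.0): None}  # room center -> name
cad_text = [("CORRIDOR", (10.5, 9.8)), ("OFFICE 101", (29.0, 12.5))]

for name, location in cad_text:
    rooms[nearest(location, list(rooms))] = name

print(rooms)
```

As the post notes, leader lines and offset labels mean a pure nearest-point match will not resolve every room, but it can clear a large share of the manual data entry.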

Although a Dynamo-based approach requires some trial and error, it allows you to quickly place a large quantity of Revit Room elements in the exact same location as a scanned drawing. Knowing that the room locations are correct allows for quicker naming and parameter manipulation using Dynamo or other means and reduces a portion of monotonous work.

RTCNA2016 Recap - "Computational Design for the 99%"


Several weeks ago I had the privilege of presenting at Revit Technology Conference – North America 2016. My presentation frequently repeated the phrase “Because Nobody Went to Architecture School to…” We have all been there at some point in our careers – continuously repeating the same manual alteration to a Revit model, changing parameter information one click at a time, or performing tedious data entry for hours on end – these are the moments when you wonder whether the practice of architecture is exactly what you dreamed about in architecture school. For all the advancements that BIM has introduced to the AEC industry, production validation and maintaining uniformity of information are still difficult undertakings. Tasks that require hours and days of individual modifications are not always professionally rewarding, and they monopolize time that could better be spent on the overall quality of the design and documentation. I often tell colleagues that if you find yourself asking, “There has to be a more efficient way to do this”, chances are good that Dynamo can help.

I did not come from a computer programming background but instead began teaching myself Dynamo to address specific problems frequently encountered in Revit. After achieving a basic understanding of how Dynamo works, I was able to investigate tasks of increasing complexity that began with simple changes to the model and evolved to automating entire processes. As my Dynamo experience continued to grow I began exploring ways that Revit could interact with other software platforms and how data could be manipulated and visualized. My skillset eventually evolved to where I understood more advanced concepts of geometry and parametricism for design but this was all built on the foundational knowledge acquired from researching daily production tasks.


REVIT MODEL ANALYSIS
In my presentation I proceeded to share a sample of workflows that respond to specific challenges encountered on projects and tell the story of tedious task automation and process improvement for architectural practice. A highlight was the opportunity to collect data on a very large healthcare project, which I developed into a workflow for tracking Revit model metrics. The goal was to look for correlations between various model metrics and how long the model takes to sync or open -- one of the most significant factors on workshared projects, because the extra seconds and minutes it takes to sync a slow model, multiplied by all the users on the project, add up to many hours of lost productivity over the course of the project. Dynamo is used to track the overall size of the .RVT file, query and count various elements and categories, parse the Warnings export file, then export all the information to an Excel file. In addition to collecting these general model metrics, the Dynamo task updated two additional spreadsheets: every warning in the model over time, and every placed family in the model over time. All three of these spreadsheets were linked into Microsoft PowerBI along with data from imaginIT Clarity’s Model Metrics tool, which tracks the time it takes to open the model over time. Over the course of three months, I ran this Dynamo definition on a daily basis for a total of 68 exports.
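A minimal sketch of the export step, with hypothetical metric values standing in for the actual Revit queries and an in-memory buffer standing in for the file on disk:

```python
# Append one dated row of model metrics to a running CSV so trends can be
# charted later (e.g. in PowerBI). Values here are invented placeholders.
import csv
import datetime
import io

def append_metrics(stream, metrics):
    writer = csv.writer(stream)
    writer.writerow([datetime.date.today().isoformat()] + list(metrics.values()))

metrics = {
    "file_size_mb": 412,    # size of the .RVT file
    "element_count": 58213, # total placed elements
    "warning_count": 367,   # parsed from the Warnings export
}

buffer = io.StringIO()  # stand-in for the real spreadsheet on disk
append_metrics(buffer, metrics)
print(buffer.getvalue().strip())  # e.g. 2016-08-01,412,58213,367
```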

The final takeaway will not be a surprise to those who are familiar with Revit model performance… the data revealed that Auditing and Compacting the model, as well as Purging Unused Families, had the most overall impact on the time it takes to open and sync the model. Although this may not be a significant breakthrough, these real-time analysis tools help monitor the health of the model and indicate the best time to intervene.

The last step was to find an easy way to communicate the status of the model to the production team. Since it is the responsibility of the Model Lead to audit the Central file, Warnings are the only characteristic that individual team members have the opportunity to impact. The project from which this data was collected happens to be a children’s hospital, so we placed an image of a Minion on the Message Board with a visibility parameter tied to the number of Warnings. The final Dynamo task overwrites the Warning count parameter in the Revit model and the Minion changes accordingly. Now the team knows that when they open the model at the beginning of a workday, if the Minion is purple the Warnings have exceeded 400 and some time needs to be set aside to resolve them.
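The Minion logic boils down to a simple threshold test. A sketch, with invented state names standing in for the image's visibility states:

```python
# Map the current Warning count to a visibility state. The 400 threshold
# comes from the post; the state names are illustrative assumptions.

def minion_state(warning_count, threshold=400):
    return "purple" if warning_count > threshold else "yellow"

print(minion_state(367))  # yellow -- model is healthy
print(minion_state(412))  # purple -- time to resolve warnings
```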

In the end RTC was an excellent experience. I thoroughly enjoyed sharing my perspective and bonding with my fellow colleagues from all over the world.

Special thanks to everyone who helped contribute to my work:
RTC & Committee
Shepley Bulfinch
Jim Martin
Jim Chambers
Jessica Purcell
Christina Tully
Margaret Gammill
PJ Centofanti
Jamie Farrell

Dynamo-litia Boston - April 2016

Dynamo is a proven tool for modifying Revit information and automating repetitive tasks, yet figuring out where to get started can be an intimidating process. Please join us for the first in a two-part series on how to begin using Dynamo. We will focus on what it is useful for and highlight several introductory workflows that can be understood with everyday Revit knowledge. Since Dynamo is new to the majority of the local AEC community, we will discuss how regular project challenges can be opportunities to explore principles and grow knowledge. For those who are already experts in other software platforms, see how Dynamo has made the process of transferring geometry and information easier than ever before.

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

SketchUp to Revit with Dynamo


Despite the possibility of creating complex geometry in Rhino or logic-driven forms with Grasshopper or Dynamo, the vast majority of designers in corporate architecture firms use SketchUp for 3D modeling and visualization. A common request is for someone to "convert" a SketchUp model to Revit but by no means is this an easy task.

I was recently asked if there is a quick way to transfer a custom designed furniture item that had been modeled in SketchUp to Revit for documentation purposes. I discovered the SketchUp to Dynamo package created by Maximilian Thumfart and built a tool that converts SketchUp geometry to a new Revit Furniture family as a freeform object. This allows you to quickly populate Revit with elements from SketchUp and annotate.

THE PROCESS:

  1. Copy and paste only the geometry you want to export into a fresh SketchUp file.
  2. Similar to 3D printing, this workflow requires organized, continuous surfaces. It helps to break a model down to its shape in plan, then trace the outline with smooth arcs and straight lines free of fractured line segments.
  3. Extrude and finalize the shape; it should remain ungrouped. Save the file.
  4. Open the Dynamo definition, browse to the SketchUp file, select the type of Revit family you want to create (e.g. Furniture, Generic Model, etc.), and click Run.
  5. In Revit go to the Architecture tab > Component and place the family in the model. Lastly, make sure to save the family into your project folder by clicking Edit Family and going to Revit > Save As > Family.

ADVANTAGES:

  • there are several ways to do this using default export and import tools in SketchUp and Revit, however the Dynamo approach is much quicker and bypasses the handling of multiple file formats.
  • cleaning up the SketchUp geometry will result in smaller and more efficient Revit families.
  • this approach could also work for larger scale uses like full building massings for design coordination and "tracing" with Revit elements.

DISADVANTAGES:

  • these families are NOT parametric, meaning they do not have adjustable dimensions or on/off visibility features. However, by going to Edit Family, dimension parameters, on/off parameters, material parameters, and other constraints can be added post-process.
  • oftentimes iterative SketchUp modeling is a messy process. Unless the geometry you are attempting to import is super organized, this method may fail.

This approach may not be an ideal permanent workflow, but it does the trick for representation and as a placeholder for proper families to be built at a later time. More importantly, it opens a lot of doors for larger-scale design facilitation.

Common Selection Methods Using Dynamo


The Dynamo visual programming add-in for Revit enables advanced information gathering, rapid model changes, and repetitive task automation previously not available with the out of the box tools. Working with robust Building Information Models often requires surgical list management -- the act of gathering, filtering, re-structuring, sorting, or otherwise altering clusters of data or information. Of all possible list management operations, the ability to target and isolate information is essential. In my experiences using Dynamo I have encountered many methods for selecting and isolating model elements, parameters, and numeric values in Revit. The following are some of the most common approaches that I find myself using time and again.


SELECTION BASICS:

As an introduction to selecting items from a list, I will use the English alphabet (26 characters) as my dataset, searching for the letter D. Below are descriptions of four of the most common selection nodes. Notice how the output of each uses a "true" or "false" value. This is called a boolean operation -- a computer science term for any operation that results in one of only two outcomes (binary): true/false, yes/no, 1/0, black/white, etc.

  1. Contains - this node produces a true/false result indicating whether the list contains any occurrence of the letter D (plugged into the "element" input port). Although the outcome is true -- yes, the list contains the letter D -- this alone will not let us isolate the letter in later operations.

  2. List.ContainsItem - similar to the Contains node, this node also searches the entire list for the letter D. However by changing the lacing to Longest -- right click the symbol in the lower corner (see red square), go to Lacing, and click "Longest" -- the letter D is checked against every item in the alphabet and the only true value returned is at Index 3 where D resides. The downside of this approach is that a sublist is created for every item in the alphabet meaning that we would need to flatten the list -- collapse everything back into one list of true/false values -- before continuing on to further operations.

  3. String.Contains - this node is excellent for searching through lines of text for a particular word or select group of words. In this case since the alphabet list only contains a single letter at every index, this node can be used to find the letter D. However, if the list contained the names of fruit and we searched for the letter "a", the node would return true values for any word containing an "a" such as banana, apple, pear, etc. For this reason, using the node to search for singular items can be problematic.

  4. == (match) - the double equals sign comes from computer science, where a single equals sign assigns a value, so two equals signs test whether two things are "the same". This is my favorite node to use for selection operations because it will find matches for any input type, whether a string, number, piece of geometry, or Revit element.

After using the contains/match nodes above to determine true/false values, the output can then be paired with the List.FilterByBoolMask node to split the outcome into two separate lists. In this case, using the == node generates a true value at Index 3 and false for all the rest. This list of true/false values is plugged into the List.FilterByBoolMask node as a mask to filter the original alphabet letters. The outcome is the letter D isolated into its own list.
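For readers who think in code, the same boolean-mask selection can be written in a few lines of Python (the node equivalents are noted in comments):

```python
# Plain-Python analog of the == node feeding List.FilterByBoolMask.
import string

alphabet = list(string.ascii_uppercase)        # A..Z, the dataset above
mask = [letter == "D" for letter in alphabet]  # the == node: one bool per letter

# List.FilterByBoolMask: "in" gets items where the mask is True, "out" the rest
filtered_in = [x for x, keep in zip(alphabet, mask) if keep]
filtered_out = [x for x, keep in zip(alphabet, mask) if not keep]

print(filtered_in)        # ['D']
print(len(filtered_out))  # 25
```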


SELECTING MULTIPLE ITEMS:

Oftentimes when working with a BIM model, you are looking to match multiple values at once. Building on the list management principles above, you can easily search the alphabet list for more than one letter. Inputting more than one letter into the == node requires switching the lacing to "Cross Product" (see small red square) because you are attempting to match multiple items against a list of multiple items, meaning you need to pair all possible combinations. The result is a list of 3 true/false values for each letter in the alphabet, since the == node attempts to match the letters F, R, and X against each. Since we are checking for any match of those three letters, the combination of List.Map and the List.AnyTrue node from the Clockwork package will comb through each list and identify whether any matches occur. The last step is to feed the new list of true/false values into the List.FilterByBoolMask node, and the letters F, R, and X are separated from the rest of the alphabet.

I recently discovered a second approach to isolating a list of search items. The NullAllIndecesOf node from the SpringNodes package combs through a list and returns the index number of any matching items. This can then be used in concert with the List.GetItemAtIndex node to extract the values from the original list at the matching indices. This node works especially well when the list being searched has repeat instances of the values you are searching for. Also notice that there is no longer any need for boolean (true/false) values as an intermediary step.
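The index-based approach translates to Python just as directly:

```python
# Find every index where a search value occurs, then pull the items back out
# by index -- no boolean mask needed (the NullAllIndecesOf idea).
import string

alphabet = list(string.ascii_uppercase)
targets = ["F", "R", "X"]

indices = [i for i, letter in enumerate(alphabet) if letter in targets]
matches = [alphabet[i] for i in indices]  # List.GetItemAtIndex equivalent

print(indices)  # [5, 17, 23]
print(matches)  # ['F', 'R', 'X']
```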


ADVANCED SELECTION - REVIT ELEMENTS:

The above list management principles can be applied to isolating and extracting elements from Revit. For example if you want to gather a list of all the chair families placed in a Revit model:

  1. The combination of Furniture in the Categories node and All Elements of Category will generate a list of all furniture families.
  2. Since we only want chairs, grouping the model elements by their family name will create organized sublists.
  3. The grouped sublists can then be searched for the word "Chair".
  4. Since the model elements are grouped in multiples of the same family, the combination of List.Map and the List.AnyTrue node from the Clockwork Package will check every sublist to see if any of the items contain a true boolean value for the text "Chair" in the family name. Another method for doing the same thing would be to use List.Map in concert with List.FirstItem, which would extract only the first true/false value from every sublist.
  5. The last step is to use List.FilterByBoolMask to filter out only the grouped sublists of Revit model elements that contain a boolean value of true.
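The steps above can be sketched in plain Python, with made-up family names standing in for the Revit furniture elements:

```python
# Group furniture elements into sublists by family name, then keep only the
# groups whose name contains "Chair" (steps 2-5 above).
from collections import defaultdict

furniture = ["Desk-A", "Chair-Task", "Chair-Task", "Table-Round", "Chair-Lounge"]

# Step 2: group elements into sublists by family name
groups = defaultdict(list)
for family in furniture:
    groups[family].append(family)

# Steps 3-5: keep only the groups whose name contains "Chair"
chair_groups = {name: items for name, items in groups.items() if "Chair" in name}
print(chair_groups)
```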

One advantage of understanding list management principles is that tasks can be achieved from multiple approaches. Here is another variation of the above method for collecting all chairs in the model:

  1. Use the nodes Categories: Furniture and All Elements of Category to extract all of the furniture families.
  2. Get the name of each family and look for those that contain the word "Chair".
  3. Filter out all elements that did not yield a true value using the List.FilterByBoolMask node. As a means of verification you can insert some Count nodes to check how many families have been identified as chairs vs. other.
  4. Given that all of the model elements come from the In output port of the List.FilterByBoolMask node, the final operation is to group all of the elements according to their family name. In theory this will get you exactly the same results as the previous method.

There is an even easier way to select model elements, once again by using the NullAllIndecesOf node from the SpringNodes package:

  1. Collect all furniture families from the model.
  2. Use the NullAllIndecesOf node combined with the name of the families and the List.UniqueItems node to identify the individual indices where matching items reside in the list and group them according to their shared family names.
  3. Feeding the sublists of indeces into List.GetItemAtIndex will extract the model elements from the original furniture list and group them accordingly.
  4. The last step would be to filter out only the chair family groups (not shown in the image).

Specific families or parameter values can be isolated using the == (match) node:

  1. Collect all furniture families from the model.
  2. Use the == node to compare a specific chair name against the list of family names to produce a list of corresponding true/false values.
  3. Filter out all of the Revit model elements that contain a true boolean value with List.FilterByBoolMask.
  4. Optional: apply a Count node to get the total number of families placed in the model for that specific item.

Please keep in mind that these examples are only some of the methods for selecting and isolating items using list management and they may not necessarily be the best methods. Different tasks and model configurations will require different approaches but the more time spent practicing list management, the easier it will become to customize a solution for any problem.

For more on list management I highly recommend that you take a look at Chapter 6 of the Dynamo Primer. Also, check out this excellent post by LandArchBIM.

When 'Moneyball' Meets Ski Racing

This winter I joined an adult ski racing league and got back into the sport for the first time in over a decade. Each week racer performance is entered into a complex scoring rubric that determines overall team standings. After seven weeks of racing, this rich dataset was too tempting for a data visualization enthusiast like myself to pass up.

How the scoring works:

  • Each team consists of four members, who can be any combination of males and females.

  • Races occur once a week and two courses are set side-by-side, one red and one blue.

  • Every racer gets two runs per race, once on each course.

  • The racers are ranked according to the previous week's performance and paired up by number, meaning you race an opponent on every run.

  • 2 Points are awarded if you earn a faster time than your opponent on each run. If your opponent does not show up to race that night and you successfully finish your run, you receive an easy Win.

  • 2 Points are awarded if your time on a particular course is faster than the time you earned on that course (red or blue) the previous week.

  • Up to 4 Points are available if you earn a time within a given handicap time range. The breakdown of these times is as follows:

How the adjusted handicap time is calculated:

  • Every week, one or two pacesetters will ski the course before all the racers. Their times are divided by their individual nationally-certified handicap percentages to determine the best possible time on that particular course.

  • The faster of the two pacesetter times is then multiplied against all of the other racers’ times to determine each person’s adjusted handicap time.

[To become certified, they attend a Regional Pacesetting Trials event where their top times are compared directly to those of US Olympian, Ted Ligety. To learn more about pacesetting, CLICK HERE.]

Back to scoring:

  • Each week racers can earn up to 12 Points for their team.

  • The top three highest scores from each team are taken and added to the team’s overall total.

  • At the end of seven weeks, the top 5 teams from each night of the week (Monday-Thursday) are qualified to compete in one final championship race. The highest scoring team wins.

At first the results were posted as paper print-outs, which meant a few hours of manual data entry to build the initial database. After establishing the database, complex calculations were required to reverse engineer the scoring system and then emulate the score keeping based on each racer’s weekly results. Once again I turned to Dynamo’s visual programming capabilities to build a tool that could process all of the information.
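As a hedged reconstruction of that scoring logic, here is a sketch under the rules described above; the handicap-bracket points are simplified to a single capped value, since the exact breakdown table is not reproduced here:

```python
# Weekly points for one racer: 2 points per run for beating your opponent
# (or an opponent no-show with a finish), 2 points per course for beating
# your own previous-week time, plus up to 4 handicap points.

def run_points(my_time, opp_time, my_prev_time):
    points = 0
    if opp_time is None or my_time < opp_time:              # beat (or no-show) opponent
        points += 2
    if my_prev_time is not None and my_time < my_prev_time:  # beat last week's time
        points += 2
    return points

def weekly_points(runs, handicap_points):
    """runs: list of (my_time, opponent_time, my_previous_week_time) per course."""
    return sum(run_points(*r) for r in runs) + min(handicap_points, 4)

# Two runs: beat opponent on both, improved on one course only
print(weekly_points([(31.2, 32.0, 31.8), (30.5, 31.0, 30.1)], handicap_points=3))  # 9
```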

Eventually the Nashoba ATR staff began posting the results to their website on a weekly basis, which eliminated the need both for data entry and for using Dynamo to compute all of the adjusted times and scoring. I could then quickly scrape the latest data each week and dump it into an Excel spreadsheet where specific metrics were calculated. After using the handy data reshaper add-in for Excel, each spreadsheet was pivoted and then imported into Tableau for visualization. Now the performance of teammates against the rest of the league could be easily understood graphically:

As well as overall league standings:

The complexity of the scoring system made for a relatively fair, enjoyable, and competitive experience regardless of age or gender. It also created some very interesting visualizations such as the breakdown of scoring by age and gender for the first four weeks:

And the median age and time per race:

[Notice how the more “experienced” racers generally smoked the rest of the field.]

Ultimately the data helped me quickly understand the league and identify the nuances in scoring that could improve my performance. It felt great to get back out on the course after so many years; I cannot wait to do it again next season!

Shout out to my teammates and special thanks to Nashoba Valley Ski Area for generously posting results online and weekly updates. The original dataset can be viewed HERE.

Design Space Exploration with Dynamo


One of the most challenging aspects of the architectural design process is determining how to organize form to fit an overall parti. Facing endless possible geometric configurations, making sequential alterations toward a fitting result can be difficult without a means of measuring suitability. During the initial phases of design research, an architect gathers essential information such as program requirements to meet a client's needs, zoning and code information for a provided site, environmental and material influences, and aesthetic preferences. These assets serve as the foundation for a constraints-based design approach where parameters can be assigned in an effort to influence and control form.

Constraints in design are rules or vocabularies that influence form through the design process. An inherent feature of the architectural process is that design must be performed within a set of given parameters. Parameters help to focus the scope of an architect by narrowing the range forms and formal relationships may take within a design solution... Constraint based design takes the parameters associated with a design problem and links them to the attributes of the formal components and relationships of a solution. (Dustin Eggink, http://goo.gl/EktbQ1 )

Dynamo is an ideal platform for constraints-based design because the visual programming environment allows you to build a parametric model that can be quickly adjusted with changes to input values.

Once you have a functioning Dynamo definition, all of the nodes can be consolidated into one Custom Node by dragging a selection window over everything and going to Edit > Create Node From Selection. This will transition everything to the custom node editing mode -- you can always tell when you are in this mode because the background is yellow.

To create a custom node, the first step is to give it a name, a description of what it does, and a category (where it will be saved in the Library). All of the input number blocks (far left side) must be swapped out for Input nodes, generally named for the variable they represent. Output nodes also need to be added after the final nodes in the definition (far right side) that are providing the finalized geometry. When these steps are complete, save the node. Back in the Dynamo node space -- also known as the canvas -- number sliders can be added to the newly-created custom node. It is helpful to click the down arrow on the left side of the node to set the minimum, maximum, and step interval because large numbers can take a while to process or crash Dynamo, while zeros will often create null values and turn the majority of your definition yellow with warnings. Now you have a fully parametric custom node that allows you to explore a range of formal configurations with the simple adjustment of number sliders.
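The slider behavior described above can be approximated in a few lines of Python. This is a minimal sketch, assuming a hypothetical twisting-tower definition -- the function names and values are illustrative, not part of any actual Dynamo node:

```python
# A minimal sketch of the slider idea: snap an input to a step interval and
# clamp it to a minimum and maximum before it feeds the parametric definition.
# The twisting-tower function below is a hypothetical stand-in.

def slider(value, minimum, maximum, step):
    """Snap a value to the nearest step and clamp it to [minimum, maximum]."""
    snapped = round((value - minimum) / step) * step + minimum
    return max(minimum, min(maximum, snapped))

def floor_rotations(floors, total_twist_degrees):
    """Rotation applied to each floor plate of a twisting tower, bottom to top."""
    return [total_twist_degrees * i / (floors - 1) for i in range(floors)]

floors = int(slider(12.3, 2, 50, 1))   # snaps to 12
twist = slider(95.0, 0, 360, 5)        # stays at 95.0
print(floor_rotations(floors, twist)[:3])
```

Bounding the sliders this way mirrors why setting minimum, maximum, and step matters: it keeps the definition away from the huge or zero values that slow down or break the graph.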

Developing custom nodes for form making allows for use with the Dynamo Customizer -- a web-based viewer currently in beta for viewing and interacting with Dynamo models in real time. This platform has a lot of potential for sharing designs in the future and allowing colleagues or clients to experiment with their own manipulations of the design.

Check out this example for the twisting tower here: Dynamo Customizer - Twisting Tower.
DISCLAIMER: you will have to request Beta access and sign in with your Autodesk ID to view this. For step-by-step instructions, visit: http://dynamobim.org/dynamo-customizer-beta-now-available/.

After guiding parameters have been established, a design space can be generated for testing all possible variations of a few select variables of a design. Design space exploration is a concept involving a virtual -- or non-physical -- space of possible design outcomes. This allows the designer to simultaneously see a wide range of options and extract only those that satisfy pre-determined criteria of fitness.

The core essence of this workflow is the use of the Cartesian product, which facilitates comparison of all possible pairings of variables. This mathematical operation can be understood as an array of combinations between x, y, z and 1, 2, 3 (below left) or as a slope graph of all possible correlations between the two lists of variables (below right).
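The same x, y, z and 1, 2, 3 pairing can be sketched with Python's itertools, which pairs every item of one list with every item of the other just as List.CartesianProduct does in Dynamo:

```python
# The Cartesian product from the example above: every letter paired with
# every number, yielding 3 x 3 = 9 combinations.
from itertools import product

letters = ["x", "y", "z"]
numbers = [1, 2, 3]

pairs = list(product(letters, numbers))
print(pairs)
```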

The List.CartesianProduct node calculates all possible combinations of the number range values; however, all of the geometry is instantiated in Dynamo at the origin point, making it appear that only one object was created even though the count shows 132 (below left). Thanks to Zach Kron and the Design Options Layout node from the Buildz package, the nested list clusters of geometric objects are arrayed according to the Grid Size spacing value (below right). Using list management logic -- such as List.Transpose, List.Map with List.Transpose, etc. -- before the Design Options Layout node will re-arrange the list structure and result in different compositions of objects.

To set up a design space in Dynamo, the inputs to the custom node are fixed values. Whichever variables you want to test must be left empty on the custom node, and number ranges are connected to the List.CartesianProduct node instead. The number of list inputs in List.CartesianProduct must match the number of inputs left open on the custom node. It is also important to note that the list order of the number ranges in the List.CartesianProduct node must correspond to the order of inputs on the custom node. The values in each number range determine the scale and form of the resultant geometry, while the count of values in each list determines the overall size and shape of the array of objects -- this is critical to remember because an excessive number of input values may take several minutes to process or potentially crash Dynamo. After the ranges of values have been set up, the List.CartesianProduct node is connected to the Design Options Layout node, which arrays all possible combinations in 3D space. Depending on the geometry being tested, the Grid Size input determines the spacing between objects. When everything is connected correctly, Dynamo will display an array of forms which can be altered by changing the number range inputs and re-running the definition. If Dynamo crashes, geometry disappears, or there is an insufficient amount of variation in the forms, continue to calibrate the number ranges and explore the limitations of the parameters in your custom node.
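The layout step can be sketched as follows. This is a hedged approximation of what a grid-spacing node does, not the actual Buildz implementation -- the parameter names and spacing logic are assumptions for illustration:

```python
# Sketch of laying out every parameter combination on a grid so the options
# do not all stack at the origin. The grid_size value plays the role of the
# Grid Size input; the height/twist parameters are hypothetical.
from itertools import product
import math

heights = [10, 20, 30]
twists = [0, 45, 90, 135]
combos = list(product(heights, twists))   # 12 design options

grid_size = 50.0
columns = math.ceil(math.sqrt(len(combos)))  # roughly square layout

placements = []
for i, (h, t) in enumerate(combos):
    x = (i % columns) * grid_size
    y = (i // columns) * grid_size
    placements.append(((x, y), {"height": h, "twist": t}))

print(len(placements), placements[0])
```

Each option gets its own cell in the grid, so every combination of the two ranges is visible side by side.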

A successful design space arrays all possible options along two or more axes, utilizing the concept of dimensionality. Design space is theoretically unlimited; however, the visualization of the virtual design space is limited by the constraints of graphic representation. Color can be added to provide visual differentiation of a third dimension, such as the analysis of generated outcomes, or could represent any of the associated properties of the variables. Criteria for evaluation of fitness refers to the means by which the best solution is determined.

For example, the calculated height of the twisting tower forms can be colorized on a minimum-to-maximum gradient (below left). Another representational technique is selective omission, or hierarchical modification of the representation (below right).
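The gradient colorization can be sketched by remapping each tower's height onto a blend between two colors. The blue-to-red palette here is an assumption; the actual gradient in the screenshots may differ:

```python
# Sketch of the colorizing step: remap each height onto a minimum-to-maximum
# gradient between two RGB colors (blue for shortest, red for tallest).
def lerp_color(low, high, t):
    """Linearly interpolate between two RGB triples at parameter t in [0, 1]."""
    return tuple(round(a + (b - a) * t) for a, b in zip(low, high))

heights = [12.0, 30.0, 48.0, 60.0]
lo, hi = min(heights), max(heights)

blue, red = (0, 0, 255), (255, 0, 0)
colors = [lerp_color(blue, red, (h - lo) / (hi - lo)) for h in heights]
print(colors)
```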

Ultimately, design space and its representation are nothing more than tools; designers still have to make decisions. Design space should function as a method of exploration for making informed, confident, substantiated decisions.


Portions of this blog post were developed in collaboration with Jamie Farrell for our course Advanced Revit and Computational Workflows taught at the Boston Architectural College.

Dynamo-litia Boston - January 2016

As the Dynamo visual programming add-in for Revit continues to emerge as an essential tool for design and production, questions surrounding justification and strategies for implementation have arisen. Recent topics for discussion include the emerging role of computational design, open source vs. proprietary development, and funding innovation. The advantages and capabilities are well documented but what sort of impact can be expected on staffing and project planning? Please join us to hear more about how Dynamo is being managed in the local community and contribute to the ongoing conversation of how firms can better position themselves.

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

Canstruction

Image courtesy of Hung Pham

Canstruction is an international competition where “cansculptures” are created using cans of food, which are later donated to local hunger relief organizations. This year, the Houston office of Shepley Bulfinch assembled a team to take on the challenge.

After coming up with a design concept, the task of estimating the number of cans and positioning for structural stability is daunting. Since cans of food are a modular unit with standardized dimensions, this is a perfect opportunity for parametric design using Dynamo.

PREPARATION
The first step was to figure out how to populate cans along a surface. Following a visit to a local grocery store for research, the height and diameter of a chosen can were entered into Dynamo to establish the base module. An undulating vertical cylinder was created to a height of 6 feet, and cross-sectional circumference curves were cut based on the height of the can.
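The sectioning step amounts to dividing the cylinder height by the can height. A quick sketch, with placeholder can dimensions rather than the real product measurements:

```python
# Back-of-envelope version of the sectioning step: how many cross-section
# curves fit in a 6-foot-tall cylinder spaced one can height apart.
# The can height below is an assumed value, not the actual can used.
can_height_in = 4.25      # assumed can height, inches
tower_height_in = 6 * 12  # 6 feet in inches

layers = int(tower_height_in // can_height_in)
cut_heights = [i * can_height_in for i in range(layers)]
print(layers, cut_heights[:3])
```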

Points were then placed along the horizontal curves at a repeated distance of the width of the can. In order to ensure structural stability, the last step was to offset the location of the points along every other curve by the distance of half a can width so that they would be perfectly staggered to land at the point of intersection of two cans on the level below.
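The staggering logic above can be sketched for a single circular curve. The dimensions are assumptions for illustration; the real curves in the model undulate rather than staying perfectly circular:

```python
# Sketch of the staggered placement: points spaced one can width around a
# circle, with every other layer rotated by half a can width so each can
# lands over the joint between two cans below. Dimensions are placeholders.
import math

can_width = 3.0   # assumed can diameter, inches
radius = 24.0     # assumed curve radius, inches
circumference = 2 * math.pi * radius
count = int(circumference // can_width)

def layer_points(layer_index):
    # Odd layers start half a can width further around the circle.
    offset = 0.5 * can_width if layer_index % 2 else 0.0
    points = []
    for i in range(count):
        angle = 2 * math.pi * (i * can_width + offset) / circumference
        points.append((radius * math.cos(angle), radius * math.sin(angle)))
    return points

even, odd = layer_points(0), layer_points(1)
print(count, even[0], odd[0])
```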

DESIGN
For the final design concept, the character Blanky from the animated classic The Brave Little Toaster was chosen to help convey the message “Stitching Away Hunger!”

The initial 3D object to visualize Blanky's body was created using NURBS in Rhinoceros and then imported into Dynamo using the Mantis Shrimp package from Archi-lab. Initial attempts at generating curves from the Rhino surface resulted in a significant loss of definition of features due to the NURBS curves in Dynamo rounding off sharp corners between points.

MODIFICATION
It was determined that a better approach would be to model the concentric curves, similar to a digital topography model. The geometry was transitioned to Revit, where contour lines were generated at a vertical spacing of the height of the can. As splines in Revit, the contour lines could be easily adjusted to maintain the precision of the shape's features.

Dynamo facilitated a seamless iterative design process as the contour curves were adjusted in Revit then queried in the definition to populate with cans. Upon visual inspection of the resulting 3D geometry in Dynamo, further adjustments could be made to the curves in Revit to perfect the overall shape.

REINFORCEMENT

Cans tend to become unstable after a certain number of stacked layers, so a common practice is to add a thin layer of supporting material at regular intervals to provide a firm horizontal surface. The team chose 1/16” medium-density fiberboard (MDF) as their horizontal reinforcement layer at every level of cans and added supplemental contour curves in Revit to simulate the 1/16” spacing. The curves were picked up in Dynamo and populated with an extruded surface to visualize the addition of supports, as well as calculate the increased overall height.

To emphasize the distinction between the character's cape and the void below, the decision was made to use taller cans of a different color for the lower central portion of the cansculpture. To account for this design alteration, the contour curves were strategically split in Rhino and, at areas to be substituted with taller cans, contour lines were removed to accommodate the increased height.

TAKEOFFS

One of the most significant benefits of designing a Canstruction sculpture with computation is the ability to perform an instantaneous takeoff of cans and materials. For this cansculpture, the final count came to 1763 short cans and 120 regular cans. In addition, there were 42 layers of support material, which ended up requiring [60] sheets of MDF. These numbers were critical for designing within budget and placing supply orders.
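The height side of that takeoff can be reconstructed as a simple sum: one can layer plus one sheet of MDF, repeated per level. The can counts and layer count come from the post; the can height itself is a placeholder:

```python
# Hedged reconstruction of the takeoff math. Can counts and the 42 support
# layers are from the post; the can height is an assumed value.
short_cans, tall_cans = 1763, 120
support_layers = 42

can_height_in = 4.25        # assumed can height, inches
mdf_thickness_in = 1 / 16   # 1/16" MDF, per the post

total_cans = short_cans + tall_cans
overall_height_in = support_layers * (can_height_in + mdf_thickness_in)
print(total_cans, round(overall_height_in, 2))
```

Numbers like these came straight out of the Dynamo point lists, which is what made ordering within budget possible.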

Image courtesy of Billi Jo Galow

TEMPLATES

Linework from Dynamo can be pushed back into Rhino or exported as an SVG file. In preparation for build day, the assembly team printed out full-scale templates for each level of the cansculpture, which allowed them to quickly place cans on top of the template, eliminating most unforeseen variances and the need for improvisation in the field.

Image courtesy of Billi Jo Galow

TEAM

Congratulations to the Shepley Bulfinch – Houston team for constructing an impressive cansculpture for a good cause. It was a pleasure to assist with the use of Dynamo and interoperability among several software platforms for simulation and delivery of their design. They demonstrated that visual programming is an easy-to-learn and invaluable tool for integration into the design and documentation process, whether producing a sculpture made of cans or an entire building.

Claudia Ponce
Hung Pham
Julie Truong
Billi Jo Galow
Sandra Bauder
Stan Malinoski

Parametric Nameplate


In an effort to become better acquainted with geometry creation in Dynamo, I created a parametric nameplate. The design concept was a cluster of “randomly” sized squares to give the appearance of a pixelated brick.

To start, a solid array of points was generated.

To create the illusion of “randomness”, points were removed at random from the array using list management and the List.RemoveItemsAtIndex node. Solid cuboids of varying size were then instantiated at the remaining points.
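The random-removal step can be sketched in a few lines. The grid size and removal count are arbitrary here; a fixed seed stands in for whatever source of "randomness" the definition used:

```python
# Sketch of the "randomness" step: pick indices at random and drop those
# points from the array, mirroring List.RemoveItemsAtIndex. The seed keeps
# the result reproducible.
import random

points = [(x, y) for y in range(4) for x in range(4)]  # 4x4 point array

random.seed(7)
remove_count = 5
indices = set(random.sample(range(len(points)), remove_count))

remaining = [p for i, p in enumerate(points) if i not in indices]
print(len(remaining))
```

Cuboids would then be instantiated only at the surviving points, producing the pixelated-brick effect.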

Lastly, 3D text solids were generated and placed centered at the front of the cluster of randomized squares. By using a boolean difference function, the text was subtracted from all of the intersecting cuboids. With the boolean union all function, the resulting shapes were all conjoined to form a unified solid.
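The boolean operations can be illustrated on a voxel grid, with each solid treated as a set of occupied cells. This is only an analogy for the solid difference and union nodes used in Dynamo, not real NURBS geometry:

```python
# Boolean solids on a 2D voxel grid: difference removes every cell the
# "text" occupies from each cuboid, and union conjoins what remains into
# one shape. Sets of cells stand in for actual solids.
cuboid_a = {(x, y) for x in range(4) for y in range(4)}
cuboid_b = {(x, y) for x in range(2, 6) for y in range(2, 6)}
text = {(1, 1), (2, 2), (3, 3)}

# Boolean difference: subtract the text cells from each cuboid.
a_cut = cuboid_a - text
b_cut = cuboid_b - text

# Boolean union: merge the cut shapes into a single solid.
unified = a_cut | b_cut
print(len(unified))
```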

In preparation for 3D printing, I exported the final solid as an .obj file to Rhino. Tools such as _Check and MeshRepair in Rhino are excellent for quality control and making sure that an object is “watertight” - no holes or non-manifold edges - so that it will successfully print.

Whether sculptural objects or architectural forms, many designs contain cantilevered elements that are difficult to print with additive technologies such as a MakerBot. In this model, the variation in stacked cuboids created several locations where cantilevers occur. Autodesk makes an excellent application called Meshmixer that can help with this problem. Meshmixer offers the ability to create, modify, and analyze the fidelity of objects for 3D printing, however one of the most useful features is advanced supports. The model is evaluated and automatic, branch-like structures are generated to meet the underside of cantilevered objects.

The supports allow for flawless printing and easily break away from the model after the print is complete to reveal perfect cantilevers.