#AU2024 wrap-up


Well, it’s a wrap! Autodesk University consolidated my career in BIM back in 2016, and I hadn’t participated in it in seven years. Much has changed since then. I’ve changed, and the world has, but one thing has stayed the same: the convention is still an incredible place for networking, learning and brainstorming about the future. Sure, it has its flaws. But even within Autodesk and its ever-growing oppressive monopoly, there are spaces for a community to flourish and develop solutions. So, here’s my wrap-up with some notes on what went down this year. But first of all, a disclaimer.

The disclaimer

The available content during the three-day convention is overabundant, and it’s impossible to access everything while on-site, but the digital catalogue has grown into a very valid complement (provided people actually put an effort into producing handouts, which isn’t always the case). Also, I suggest you keep an eye out for other articles on the conference, such as these 18 highlights by Nicolas Catellier, the recap by Daniel Stene here, or Kean Walmsley’s highlights here and here. And raise your hand if you’ve seen something you think is worth mentioning! The only way to gather value is to bring it together.

That being said, let’s talk about AU.

This post is split into sections:

  1. The location and venue, featuring some notes on San Diego;
  2. Hackathons and the Community Zone with a focus on the Dynamo Hackathon, LEGO and the Puppies;
  3. The Keynotes;
  4. The Expo Arena, featuring Datagrid, the Volterra-Detroit Foundation, and a surprising return of Maya, but also the Fireside Talks with Developers;
  5. Classes featuring topics such as:
    1. Community Building;
    2. Structural and Infrastructural Revit;
    3. Revit and Dynamo Tips;
    4. Rhino to Revit;
    5. Sustainability and Carbon Footprint;
    6. Digital Innovation and Implementation Processes;
    7. Artificial Intelligence, which is the buzzword of the moment;
    8. Participatory Planning, as I was the only one with LEGO, but I wasn’t the only one on this topic.

Did you see a great class that you can’t find mentioned here? Drop it in the comments! And maybe we’ll see each other in Nashville (which is an awesome place, and I talked about it when I was there for this other thing).


The location and venue

I was excited to come back to San Diego — I had been there back in 2019 for the Esri User Conference — and I think it was a good choice for a venue after many years in Vegas and the… not-so-happy experiment of Saint Louis two years ago. The city is friendly, the weather is wonderful, the convention centre is relatively well-serviced.

Though apparently the industry’s influencers are a group of lemmings…

My only significant complaint has to be the weird layout and sizing of classes, whose reservations filled up very fast while the rooms still remained half-empty. Something was off with the reservation system, I think.
I would have more to say about the town, its attractions, about seals and model trains, but I think I’ll do a separate post on that, if I ever find the time.


Hackathons and the Community Zone

Autodesk University always starts the day before it does, everybody knows that, but this year Autodesk went all-in and decided to hold not only the usual preliminary activities on October 14th (namely the Dynamo Hackathon, more on that later) but also the reception party. Which is a bit weird, if you consider that the closing party was also on the last day. It’s almost as if they were trying to cut down on the number of participants. I hope not, because mingling and networking is half the value of the conference. Anyway, the highlight of day 0 was undoubtedly the Dynamo Hackathon, hosted by the one and only Sol Amour.

I’m not in a mental state that’s suitable for participating in hackathons, and I haven’t coded in years, but this was a cool event to attend and, based on what I’ve heard, it provided the perfect mix of innovation, fun, and unconventional approaches to scripting. You can read Patrick Poyden summing up the work of his group over here: they made a Dynamo game that tested your knowledge of Revit and… erased stuff from your model if you screwed up. They won the “most fun” prize, though I think they should have won the “most evil” prize. Which, if you know me, you’ll understand is a very high compliment.

On a more serious note, the day was a chance to announce the Beta availability of Dynamo as a Service (DaaS). The other teams were the InFORMAnts, who created a prototype to generate Dynamo graphs through an LLM from a prompt in natural language (special mention for innovation), the ?ext:daas, who developed some Civil3D-Forma-Revit interoperability, the Utah Teapot Programmers (cue the “I understood that reference” meme), and the EcoMatic, led by the awesome Enrique Galicia (you can read his wrap-up here).

The second highlight of the day was the opening of the Community Zone, where the Autodesk team really went above and beyond with spaces for people to meet and a special corner to catch up after classes (though, with the convention centre organized linearly and classes potentially very far away from the central plaza, I wonder how many people actually used that). One of the most popular corners had to be the LEGO table, a long cornucopia for people to pimp their badges with minifigures and, over the following days, to collectively build stuff at the centre.

The other big highlight was undoubtedly the puppies’ corner. Yeah, you heard me right. Back in my day (god, that sounded old), Autodesk University in Vegas had a corner where you could unwind and try to work on your childhood trauma with specialized dogs. This time, they upped the game by bringing in actual puppies. People stood in line for a very long time, got inside the fence and sat down, waiting to be chosen by a puppy to cuddle. It was just about the wholesomest thing you could see.

Thirdly, the Autodesk merchandise booth had a pretty significant line of people wanting to buy branded socks or a backpack. They used to give you one of those with your badge, but I guess Autodesk can’t afford it anymore with the prices of licenses going down and… oh, no, wait.
Sarcasm aside, I stood in line, too. I had to get my hands on a significant piece of merchandise, and I’m not ashamed of it.

Behold… the otterdesk.

There was also an arena for Design Thinking workshops, and I had the pleasure of attending one on the “future of work”. Although the time wasn’t enough to dive deep into the topic, I’m always fascinated by how design thinking manages to foster new ideas in a diverse group of people. We had some interesting insights around topics such as training, learning, certifications and quality control. Here’s my storyboard, based on an idea from my partner in crime.

So what else, what else was going on outside of the Community Zone?


The Keynotes

The first day of Autodesk University always starts off with the keynote and, I’m sorry to say, I was struck by how tone-deaf this session was. It’s possible I’ve changed too much to enjoy this. It’s possible people have changed too, as the “make some noise” calls were met with embarrassed silences. Sure, Andrew Anagnost managed to get a few laughs out of us, but I can’t really connect with what he’s saying when he complains about his self-driving car and his mocha latte (an abomination in itself), and shows an AI-generated picture of himself without addressing the main concern, which is the stereotype of the white dude. If the purpose was to address that stereotype with some self-irony (which is possible; he’s been known for that), this time he failed spectacularly.

Focusing on the content, Anagnost made some bold claims about Artificial Intelligence, which was one of the peak topics of the conference, as you might expect. I understand a keynote is not the moment to get deep into a topic, but saying AI is at its peak of inflated expectations (with an obvious reference to the Gartner hype cycle) isn’t a supported statement.

This is the Gartner hype cycle for emerging tech, in its 2022 edition.

If you’re not familiar with it, the Gartner hype cycle illustrates how a technological solution goes from its innovation trigger through a steep climb of expectations up to a point where people expect it to solve every problem in the world, then plunges into a dark pit of disillusionment and eventually reaches its full potential (which is lower than the inflated hope).

Placing Artificial Intelligence on the Gartner hype cycle presents several challenges, primarily due to the complexity and diversity of AI technologies and their varying stages of maturity across different sectors. AI encompasses a vast range of technologies and applications, leading to varied experiences among organizations. Different enterprises are at distinct stages of AI adoption, making it difficult to categorize AI’s overall progress on the cycle: as Gartner analyst Bern Elliot noted, Artificial Intelligence has already reached the “plateau of productivity” for many solutions, and it’s all too easy to blame one’s shortcomings on the technology itself. Additionally, generative AI in particular isn’t a solution but the proverbial “hammer looking for a problem.” My feeling was that, in framing AI as “emerging” (which it hasn’t been for years now), Anagnost was making up for his own company’s shortcomings in introducing it and implementing it, let alone developing their own model, as the smartest thing you can get right now is a little toy called Project Bernini, which generates a 3D shoe based on your bad sketch of a shoe.

Picture by Kean Walmsley.

The second keynote, by Autodesk’s Executive Vice President and Chief Technology Officer Raji Arasu, was even more concerning when it came to Artificial Intelligence. As I mentioned in my wrap-up of day 1, the approach is as troublesome as you might expect. The company is apparently looking to broaden the scope of its Autodesk Assistant, currently only available in AutoCAD as a pesky Clippy-styled assistant suggesting better ways to use your software (while we all know that the only good AutoCAD is a closed AutoCAD). With its scope expanding into Revit or the Construction Cloud, the question arises… are they using our data to train the model? Apparently the answer is yes. And though I understand and appreciate a positive approach to this, such as the one Steven Shell proposed in the comments of my LinkedIn post on day 1, we’re far from the days when you would volunteer your information for everyone’s sake: opting in should be a willing choice and, maybe, rewarded in some way.

More information on Autodesk and Artificial Intelligence is here. Enjoy. Or not.

Their approach to transparency in Artificial Intelligence was advertised through the introduction of “transparency cards” (more on that here). There isn’t one for Revit or the Construction Cloud yet, as those systems are still under development, but there is one for Dynamo. It’s beyond ridiculous.

It’s an autocomplete. We know it’s based on a Predictor Technology.

Oh, and also… Forge changes its name

Rebranding has always been a problem for the new generation of Autodesk products, and apparently Forge isn’t immune. The new name is Autodesk Platform Services, and it’s pretty much the same old thing: visualization, automation, ACC capability extensions.

We do have some new features, though, such as the Content Catalog feature in the Construction Cloud. Do you remember BIMobject? Well, it started pretty much as something like that, and then it got integrated.


The Expo Arena

After the keynote, the exhibitors’ expo traditionally opens its doors and, amongst old acquaintances and friends, here’s my top three.

1. Datagrid

I had the pleasure of talking with Hitansh Nagdev at the booth, and he showcased some pretty fancy solutions. Connecting with the Autodesk Construction Cloud, the system crawls through your documents and either suggests queries (not very differently from what AppSheet does with Sheets from Google Drive) or responds to qualitative inquiries such as “group markups by category”, summarizes documents, and suggests links between bits and pieces of knowledge that might be spread around your Common Data Environment.

The downsides? Two of them. You can’t really get what it does based on the website alone, and it runs on credits but isn’t really clear on their consumption, which makes it difficult to develop a budget. They promised they’re working on the second part, but I would really like to run some more comprehensive tests.

They also taught a class and you can find it here.

2. The Volterra Project

You might remember it from a few years ago: a group of incredible professionals, led by Paul Aubin and Andrew Milburn, surveyed Volterra through laser-scanning and a series of workshops from October 2016 through to 2019, 2022 and 2024.

In addition to producing spectacular photo-realistic virtual replicas, which can be used to share the experience of the city with anyone in the world and give the city precise documentation for potential future reconstruction, the data has been used for research on ancient architecture, which has resulted in significant discoveries presented at several prestigious international archaeological conferences.

The next International Digital Preservation Workshop will land on these shores on March 30 – April 11, 2025 (more information here), and they were present at the conference with a spectacular central booth. It featured the point clouds, the finished Revit models, the 3D-printed models of key buildings that were involved in the survey, and a special project called Volterra in Time, in which the models were thrown into Unreal to use the game engine for an immersive experience. The workflow was discussed in this class over here: The Volterra Challenge: Workflows from Autodesk to Epic by Brey Tucker. Here are some more pictures.

3. Maya

Yeah, you heard me right. We all thought Autodesk bought Maya to kill it, but this year I saw a renewed interest in the software on two fronts: procedural modelling (which is their way of saying “parametric”), and automatic rigging. The latter is an old feature that never worked, but apparently they bought a new startup and its AI-assisted rigging works like a charm, both on 3D models and to replace actors with 3D characters in a filmed scene. You can take a look here. There was also a nice class on the implementation of AI into Maya, and you can find it here.

Also, the Expo Arena had little stages hosting occasional talks and the Fireside Talks with Developers, a new format, or at least it was new to me. I must say, I really enjoyed it: developers from software like Inventor, AutoCAD and Revit (actual developers, people who were either involved in the development since its conception or still involved) sat in a lineup on the stage, and people from the audience stood in line to roast them with questions on a microphone. Maybe an actual fireside talk, with a different layout, would be an improvement. Still, it’s not a format we get every day, and we got an insight into how important it is to ask for features on the Autodesk forums, as the developers’ hands are pretty much tied by commercial reasoning when it comes to what actually gets implemented in the software. Which explains a lot.

But Autodesk University is mostly about classes, so let’s take a look at them.


Classes

Since I had my own class on the first day, I didn’t attend as many classes as I could have, and I kept a chunk of my schedule free for some final tweaks on the slides and a last-minute recap. There are too many classes to follow anyway, but I did take advantage of the digital catalogue, so here are some extended highlights of the cool stuff that was going on.

Community Building

Titled Hosting BIM Titans: Lessons from 80 Live Talks and 700,000+ Views, this session by Nicolas Catellier was the one I skipped because I was anxious about my own, and I’m really looking forward to seeing its recording, as it was a class about community management and content creation. Nic is the host of the BIM Pure Live series on YouTube, which launched in 2020 and has a rather unique format that blends podcasts and webinars to engage a global audience on topics such as Revit workflows, AEC technology trends, and emerging BIM-related software. The show grew rapidly thanks to its consistent engagement with viewers and insightful content, covering various topics related to Revit, artificial intelligence in BIM, and advanced architectural technologies. The diversity of episodes has been crucial, featuring experts in niche areas like Revit landscape, detailing, scheduling, and concrete modelling, to name a few. Catellier presented the lessons learned from this successful project. Judging from the handout, but mostly from the presentation, these are the takeaways:

  1. Challenges of getting started included technical issues, a lack of public speaking experience, and the financial demands of producing quality live content;
  2. The show has been successful due to the calibre of experts featured, such as Purvi Irwin on Revit Schedules, Marzia Bolpagni on LOD vs Level of Information Need, and Michael Kilkelly on Revit API with C#;
  3. One of the distinguishing features of BIM Pure Live is its focus on interactivity, making it more dynamic compared to standard corporate webinars. Instead of following the typical webinar structure of uninterrupted presentations followed by a Q&A, the show fosters an informal, conversational atmosphere, as Catellier aims to recreate the feeling of friends discussing BIM over a pub talk, asking questions throughout the presentation and encouraging audience engagement via comments and live interactions;
  4. The concept of “state changes” is introduced to avoid monotony—frequent shifts in tone, format, or activity keep the audience alert and interested. These changes can include reading comments from the chat, making jokes, asking spontaneous questions, or using interactive software tools such as OBS (Open Broadcaster Software).
  5. He underlined the importance of striking the right balance between being an instructor and an entertainer. Successful public speaking is a blend of both; if you’re too focused on delivering information without being engaging, the audience will tune out, but being too entertaining without substance results in a lack of meaningful content.
  6. Technical aspects are also covered, such as the use of a mailing list as a lead asset so as not to rely on social media algorithms (something many of us have grown to overlook over the years), and the need for consistency in planning content, which is something I often struggle with.

“Anything worth doing,
is worth doing right.”
– Hunter S. Thompson

The final section of the handout discusses the concept of leverage in the internet era. Catellier explores how the internet has made it possible for creators to amplify their impact through digital tools like YouTube, blogs, and social media. He compares the reach of a private training session with 10 people to a YouTube video that can be watched by 100,000 viewers, demonstrating how internet platforms offer unprecedented scalability for sharing knowledge, and encourages others in the BIM and AEC fields to take advantage of these opportunities, share their expertise, and maximize their impact by leveraging tools that have no marginal cost of replication, such as videos, blog posts, and open-source projects. And, albeit struggling with videos, I rather agree.

Another session on a similar topic, titled Preconstruction Leaders and Managers Meetup, was held by Mark Austin from Autodesk. I usually avoid those (sessions by Autodesk employees, I mean), but this wasn’t a class: it was an actual meetup. Questions asked were:

  • What are the biggest struggles that precon teams are facing right now?
  • How and what are precon teams doing to remove inefficiencies in their current departments and companies?
  • How do we keep precon talent safe from burnout and/or turnover?

And, of course, stuff like What gets you excited about the future with Autodesk?, which brings down the value of the whole thing. Still, I’d like to hear from people who participated, to understand how that went.

 

Structural and Infrastructural Revit

Classes by vendors are another thing I try to steer clear of, but Graitec often proves to be the exception. The class Advanced Engineering Practices in Revit, Bridging Structural Concrete Design, Rebar Modeling, and Detailing for Buildings by Stevens Chemise and Daniel Gheorghe was further proof, and I suggest you take a look at their luxurious handout if you don’t believe me. The class was a lab, and there aren’t additional materials available this time around, but it looks very well structured.

Participants went from launching a structural analysis to using the PowerPack Design add-on to enhance results.

The second lab I’d like to highlight is the Modeling Bridge Superstructures in Revit in a Scalable Way session by Terje Fjellby and Thomas Østgulen from Norconsult Norge AS. It started with some sensible set-up of coordinates in Revit and proceeded from there: importing the topography into Civil3D, creating the alignment and getting the points, which then get thrown back into Revit. It’s nothing new to me, but it’s nice to see someone else doing things the same way. Of course, they propose their own tool to do it, but it’s pretty much the same thing.
Some other cool functionalities include drafting a cross-section (which is always tricky when it comes to non-straight infrastructure, and god knows life is never straight) and placing adaptive components which, again, is part of the usual standard workflow. From the handout I can’t tell which category is picked for the superstructure, but I encourage you to take a look at it for some bonus exercises on coding and some tool-building.

If you’re into structural Revit, however, you know you can’t miss it when the one and only Marcello Sgambelluri steps into the field, and one of his classes this year was focused specifically on this: The Structures Lab: Learn How to Model with the New Revit Structural Analytical Tools. With his unique blend of showmanship, competence and enthusiasm, Marcello guided participants through new ways of dealing with the structural module in Revit, one of the areas where the software has seen the biggest improvements.

Revit 2023 software has completely overhauled the analytical modeling workflow. Now the physical and analytical models have the freedom to act independently. The analytical model elements must be created separately from the physical structure; however, you can still “link” them together. This ultimately gives better control over the analytical model and what it is allowed to represent. These changes are drastic, but don’t worry. We’re here to help explain how it all works, and have you give it a try in this hands-on lab. You don’t need to be a structural engineer or a Revit structural modeler, because we’ll show you how you could use the Revit analytical elements to help in the creation of physical walls, physical beams, and many other fun creations. In this lab, we’ll also show you how to create your own workflows using Dynamo to customize the automation of the creation of the analytical elements.

The Handout is here and, as usual, is neatly organized with one-page sheets for each exercise, and they include:

  1. Create Analytical Beam on Stair Stringer;
  2. Create Analytical Panel on Stair Stringer and Send to Robot;
  3. Create Analytical Single Floor;
  4. Modify Analytical Member via Hosted Point;
  5. Auto-Create Analytical Member from Physical;
  6. Create Analytical Members At Top of Steel;
  7. QA/QC Your Structural Beams W/ Analytical;
  8. Auto-Create Physical Members from Analytical;
  9. Create Physical Members At Top of Steel;
  10. Using Analytical to Model Physical Beams;
  11. Connect Revit with Robot Analysis (using the head of a dragon).

After the Sheet section, the Handout also has a “Very Wordy Format” for those of us who want to actually read stuff.
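If you want a taste of what exercise 5 (Auto-Create Analytical Member from Physical) boils down to in API terms, here’s a minimal sketch in pyRevit-flavoured Python. It is not Marcello’s code: the AnalyticalMember and AnalyticalToPhysicalAssociationManager names assume the Revit 2023+ structural API, so treat it as the general idea rather than a ready-made tool.

```python
# Hypothetical pyRevit-style sketch (NOT from the class handout): create an
# analytical member along a physical beam's location curve and associate the
# two. Class/method names follow the Revit 2023+ structural API and may need
# checking against your exact Revit version.
from Autodesk.Revit.DB import Transaction
from Autodesk.Revit.DB.Structure import (
    AnalyticalMember,
    AnalyticalToPhysicalAssociationManager,
)

uidoc = __revit__.ActiveUIDocument
doc = uidoc.Document

# Assume the first selected element is a physical beam with a location curve
beam = doc.GetElement(list(uidoc.Selection.GetElementIds())[0])
curve = beam.Location.Curve

t = Transaction(doc, "Analytical member from physical beam")
t.Start()
member = AnalyticalMember.Create(doc, curve)   # new-style analytical element
manager = AnalyticalToPhysicalAssociationManager.GetAnalyticalToPhysicalAssociationManager(doc)
manager.AddAssociation(member.Id, beam.Id)     # keep analytical and physical linked
t.Commit()
```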

 

Revit Tips

Forget the handout for the class Optimizing Projects in Revit for Factory and Manufacturing Design Packages, delivered by Kristin Lorentzen (Design Manager) and Clinton Giles (BIM Designer), both from Tesla: I’m sure they have their excuses, but the document is frankly non-existent. Judging from the presentation, though, there are some nice takeaways from this class showcasing the Revit strategy at Tesla.

  1. Revit’s Multi-Project Management Features:
    1. Linked Models: Revit allows multiple projects within a single factory environment to be represented by linked models. This ensures that different teams working on separate factory areas or projects (such as expansions or new installations) can coordinate their efforts while keeping models interconnected. This is particularly helpful in environments where factory sections are interdependent, and worksets wouldn’t be suitable.
    2. Modular Design: Revit’s modular approach enables the breakdown of a factory into smaller, manageable sections. Each module (or area of the factory) can be designed, updated, and optimized independently, while remaining connected to the larger factory model.
  2. Worksets and View Templates:
    1. Worksets: in large models, especially in factory design, managing visibility and access is key to performance. Worksets allow different parts of the design to be isolated and worked on independently, minimizing model complexity and improving workflow. Teams can assign worksets to different disciplines or tasks, ensuring that only necessary elements are loaded, improving performance and collaboration.
    2. View Templates: these streamline the creation and management of views by predefining settings such as visibility, graphic overrides, and level of detail. Using view templates, teams can standardize views for different projects, ensuring consistency and reducing the need for manual adjustments.
  3. Automation with Ideate:
    1. Ideate Clone: this plugin automates the cloning of views and sheets for new projects, speeding up the creation of permit packages. It automates repetitive tasks like renaming views, project numbers, and sheets, which not only saves time but also ensures consistency across different projects.
    2. Efficient View Management: by utilizing Ideate tools, teams can manage view filters and simplify complex models, making it easier to focus on specific areas or projects within the larger factory environment.
  4. Centralized Standards and Templates:
    1. Standardized Templates: maintaining a set of templates for project setup and naming conventions ensures uniformity in documentation and design standards. This is particularly useful for multi-project environments, where multiple permit packages must be generated from the same model.
    2. Naming Conventions: standardized naming across different disciplines helps in organizing large Revit models. This ensures that each project has a unique identifier, allowing for the efficient creation of views, sheets, and permit packages without confusion.
  5. Real-Time Collaboration Tools:
    1. Autodesk Docs and BIM 360: the cloud-based tools enable real-time collaboration between different stakeholders, including architects, engineers, and contractors. The tools allow for live updates and coordination across different teams working on various parts of the factory model, which is critical in large-scale, multi-project environments.
    2. Scenario Planning: the ability to simulate different design scenarios within Revit allows teams to test various configurations (such as reconfiguring production lines or expanding sections of the factory) without needing to start from scratch. This feature speeds up decision-making and helps optimize factory layouts.
  6. Challenges in Managing Large Models:
    1. Model Complexity and File Size: Revit models containing multiple projects can become large and unwieldy, leading to performance issues. The presentation suggests utilizing worksets, linked models, and view templates to minimize the complexity and improve performance.
    2. Coordination of Revisions: factories undergo constant changes, requiring careful coordination of revisions. Revit’s revision management tools and the use of cloud collaboration tools ensure that changes are tracked and communicated efficiently across different projects.
  7. Permit Package Creation:
    1. Efficient Sheet and View Set Automation: using predefined sheet sets and automating the export and publishing process ensures that permit packages are generated quickly and accurately. This feature is particularly useful in factory environments where multiple permit packages may need to be generated simultaneously for various ongoing projects.
    2. Version Control: managing ongoing changes across multiple projects is challenging, but by streamlining the versioning process, teams can reduce the risk of errors and maintain consistent documentation across projects.
  8. Other Tools:
    Acoustic Panels and Collaboration Tools: the presentation mentions utilizing accessories like acoustic panels to reduce noise in the workspace and tools such as the Elgato Wave Mic Arm and Stream Deck to enhance productivity during live collaborations.

And before you ask: yes, this summary was done by uploading the presentation into ChatGPT. It took a grand total of 2 minutes between uploading, reading and formatting. The fact that they presented a 2-page handout is inexplicable to me. Overall, it seemed a nice basic class on some advanced Revit functions, and we need those as well.
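To give a sense of what tools like Ideate Clone are automating under the hood, here’s a toy sketch of duplicating and renaming views through the Revit API. It’s my illustration, not Tesla’s or Ideate’s workflow, and the package code in the naming convention is made up; a real permit-package setup would also handle sheets, viewports and title block data.

```python
# Toy illustration of view cloning + renaming, the repetitive task the Tesla
# team hands off to Ideate Clone. Plain Revit API calls in pyRevit style;
# the naming convention string is hypothetical.
from Autodesk.Revit.DB import FilteredElementCollector, Transaction, View, ViewDuplicateOption

doc = __revit__.ActiveUIDocument.Document
new_package = "PKG-0042"  # hypothetical project/package code

views = [v for v in FilteredElementCollector(doc).OfClass(View)
         if not v.IsTemplate and v.CanViewBeDuplicated(ViewDuplicateOption.WithDetailing)]

t = Transaction(doc, "Clone views for new permit package")
t.Start()
for view in views:
    copy_id = view.Duplicate(ViewDuplicateOption.WithDetailing)
    doc.GetElement(copy_id).Name = "{} - {}".format(new_package, view.Name)
t.Commit()
```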

 

Then again, if you’re looking for Revit (and Dynamo) tips, Marcello Sgambelluri is usually there for you, and this year was no exception: his Off-the-Wall Revit, Revit Family, and Dynamo Tips was a cornucopia of tricks, spanning from basic stuff like measuring any distance, isolating objects and working in hidden views, to more advanced stuff like creating a beam on a complex surface and the analytical side of beams, to fun stuff like creating tire tread in your Revit model of a car (because why not), adaptive components, and some Dynamo to fetch DWG imports or links, deal with meshes in families and with complex topography (on a guy’s head), and, finally, some fancy dynamic clash detection with Twinmotion.

Rhino to Revit

In his Connecting BIM and Design: The Heatherwick Process, Alfonso Monedaro showcased, through the Olympia project, Heatherwick’s approach to integration and automation, diving deep into BIM goals and what’s needed / good to have when it comes to LODs, spanning from Design to Construction through Documentation.

The design process and BIM implementation are often disconnected and don’t develop in parallel, making BIM tedious and meaning that full adoption is delayed until stage 3. Break the silos and create a seamless workflow between BIM and design teams from the start of the project and delay your design freeze.

The Handout, which I highly recommend, showcases the integrations with a heavy emphasis on Rhino being used in the conceptual stages, which seems to be something we’ve given up fighting against. It doesn’t just show a geometry translation script (every Jim and Joe can do that, these days) but an accurate model management strategy (page 8) and geometry standards based on the desired output.

The slides (here), which I recommend even more than the handout, follow the latter with the addition of many pictures from the projects and, most importantly, an in-depth section on the interoperability strategy. I can’t wait for the recording of this class to come out.

Sustainability and Carbon Footprint

The class LCA at Herzog & De Meuron: Building an Integrated Digital Workflow outlined the process of developing a customized Life Cycle Assessment (LCA) pipeline tailored to the architectural practice of Herzog & de Meuron. The pipeline addresses the limitations of traditional LCA tools, particularly their inability to integrate seamlessly into different design phases and with varying BIM standards used across different projects and regions. The goal of this custom tool, called CALC, is to provide more flexible and iterative environmental assessments during the design process, enabling the design team to make more sustainable decisions early and often.

You can read more about it in their handout from the digital catalogue class, but I’ll give you the key topics in a nutshell:

  1. The motivation behind developing a custom LCA Tool: Life-Cycle Analysis is pretty much a must, in international architecture, but Herzog & de Meuron found that existing LCA tools were insufficient for their needs, particularly because they often act as “black boxes,” which limit transparency and adaptability to specific project requirements. The firm faced challenges such as integrating different BIM standards, adapting to various local regulations, and handling the iterative nature of architectural design, which involves overlapping phases and ongoing revisions. They also needed a tool that could manage the diverse scales and typologies of their international projects.
  2. The solution was to develop a custom LCA pipeline designed to overcome these challenges by allowing for continuous LCA calculations throughout the design process, rather than relegating LCA to the project’s end. This, in theory, ensures that environmental performance considerations are embedded in decision-making from the earliest stages of design. The firm wanted a system that could handle the complexity of international practice while being transparent, adaptable, and capable of fostering knowledge-sharing across projects.
  3. The application CALC consists of two main modules: CALC BUILDER and CALC PROJECT. CALC BUILDER is used to create assemblies—combinations of building materials and components—that are applied to project-specific models in CALC PROJECT for LCA calculations. This system allows the design team to input and reuse data across projects, ensuring consistent assessments. Both modules are integrated into Herzog & de Meuron’s existing BIM workflow via Revit, and they rely on a centralized data repository that stores material performance data.
  4. Key features of CALC include real-time interaction with BIM models, flexibility in handling incomplete data, and the ability to track the environmental impact of different design options quickly. For instance, it can compare the carbon footprint of a wooden building versus a concrete building using the same geometry, helping designers make more informed, sustainable choices during the critical early phases of a project.
  5. The innovation of CALC is its use of a system-based approach, inspired by BIM’s Levels of Development, allowing the firm to perform LCA calculations at different levels, depending on the stage of the project. For early-stage models with limited data, the system uses predefined assemblies and systems (such as walls, floors, and roofs) that represent complex building components in simplified forms. This enables meaningful sustainability analysis even when only basic massing models are available. The system-based methodology provides several advantages: it enables rapid comparisons between different design options, allows LCA calculations to be performed with incomplete data, and helps build internal benchmarks by comparing different projects and design iterations. This approach also supports the firm’s goal of creating a unified framework for LCA across their global practice, despite the diversity of BIM standards and modelling practices.
  6. Company standards are also a big part of CALC. To ensure consistent LCA calculations across different projects, Herzog & de Meuron developed a mapping process that standardizes inputs from diverse models by grouping, filtering, and mapping model elements (such as columns, walls, and floors) to assemblies, which are then used for LCA calculations. A configuration file defines how model elements should be categorized and mapped, and this configuration evolves as the project becomes more detailed (a toy illustration of the idea follows after this list).
  7. The pipeline is primarily used by specialists to create complex assemblies by selecting detailed Revit elements and retrieving material data from the firm’s material library. Calculations are performed on these assemblies, which are then archived in the central repository for use in future projects. The 3D geometry of the assemblies is stored in Speckle, and the assembly data is also visualized through custom dashboards in PowerBI, providing the design team with insights into both the environmental performance and the geometric attributes of the assemblies.
  8. The development was a collaborative effort involving the Analytics team, the Design Technologies team, BIM managers, and external industry peers. The firm also worked closely with other architectural practices and software companies to refine their ideas and ensure that the tool would meet the needs of all stakeholders involved in the LCA process.
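Since the handout doesn’t publish the format of that configuration file, the snippet below is a purely hypothetical illustration of the idea behind it: group model elements by category and type-name pattern, then point each group at an assembly in the central repository. CALC’s actual schema will certainly differ.

```python
# Hypothetical element-to-assembly mapping config (NOT the actual CALC format):
# group model elements by category and type-name pattern, and point each group
# at an assembly stored in the firm's central repository.
mapping_config = {
    "version": "early-stage",  # evolves as the project gets more detailed
    "rules": [
        {"category": "Walls",  "type_contains": "EXT", "assembly": "ExteriorWall_Timber_v2"},
        {"category": "Walls",  "type_contains": "INT", "assembly": "InteriorWall_Gypsum_v1"},
        {"category": "Floors", "type_contains": "",    "assembly": "Slab_Concrete_LowCarbon"},
    ],
}

def assembly_for(category, type_name):
    """Return the assembly to use for an element, or None if it is unmapped."""
    for rule in mapping_config["rules"]:
        if rule["category"] == category and rule["type_contains"] in type_name:
            return rule["assembly"]
    return None

# Example: an early-stage massing wall typed "EXT - Generic 300mm"
print(assembly_for("Walls", "EXT - Generic 300mm"))  # -> ExteriorWall_Timber_v2
```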

The slides are available here.

 

Some other interesting takes this year involved Forma, a tool I can’t grow to appreciate, and I’m sure it’s because of its similarities with many other tools that started out as interesting aids for sketching through analysis. One of these was London in Dubai: Simulating Climate Change with Autodesk Forma Software’s Microclimate Analysis by Giulia Pustorino. The main objective of the session was to explore how to simulate the future effects of extreme heat in urban environments using data-driven tools, and it dived into the topic by simulating the conditions of London under the extreme temperatures of Dubai. The handout is over here. The concern is that urban centres, which already act as “heat islands” where temperatures are significantly higher than in the surrounding rural areas, will be more vulnerable to climate change: materials are usually involved in this kind of analysis, but the session explored other factors too, like how the narrow roads of London act as “air traps” and how human wellbeing is affected.

And if you think simulating London in Dubai’s conditions is a bit extreme, which it is, the session also demonstrated a more plausible scenario, London’s climate evolving to resemble Madrid’s by 2050, and the main idea behind it is simple: our designs need to be future-proofed.

Digital Innovation and Implementation Processes

Smoothing the Transition from Modeling to Prefabrication, by Kelli Lubeley at Cupertino Electric, covered some nice topics around connecting 3d models and (pre)fabrication in a way that was strongly focused on the process more than the tools.

The first step toward improving the transition is, of course, knowing where you stand, which means documenting the existing workflow. Understanding the current state of processes and tools is critical to identifying opportunities for improvement, and this can be done by focusing on three aspects:

  1. Communication: examine how data is shared between tools and teams, how coordination issues are addressed, and whether there are regular meetings for alignment.
  2. Content Library: investigate where content is stored, its accessibility, and the responsibility for content creation and maintenance.
  3. Deliverables: define install and prefab standards, ensuring they are easily accessible by all teams involved.

“You can’t really know where you are going
until you know where you have been.”
— Maya Angelou

The second step is gathering stakeholders. Prefabrication requires collaboration across multiple departments, including Project Management, Procurement, Production, BIM/VDC, the Fab Shop, and IT. Ensuring these stakeholders are involved in discussions about processes, tools, and improvements is critical to building a successful transition framework, and these discussions should be documented.

Adopting modelling is of course about improving efficiency (we don’t do it just for the sake of doing it), so the class highlighted a couple of tools to manage the proposed improvement:

  • a Difficulty/Importance Matrix. After documenting the current state, opportunities for improvement are prioritized using a Difficulty/Importance matrix. This helps determine which changes provide the highest return on investment (ROI). For instance, opportunities like more accurate Bills of Materials (BOMs) and standardized prefab drawings can be ranked according to their difficulty and importance, helping teams focus on high-value improvements.
  • the Roadmap: once opportunities are prioritized, the roadmap with clear milestones and deadlines is created to guide teams through the improvement process.

To improve efficiency in the transition from modelling to prefabrication, the document highlights the need for standardized tools and procedures, including:

  • Standardized Naming Conventions: this step is critical in high-communication environments to ensure that everyone across different departments understands assembly names, document titles, and deliverable processes. Sources like ISO 19650 provide examples of file naming conventions, and you must believe they work.
  • Updated Content Libraries: once naming conventions are standardized, content libraries should be updated to reflect these changes. A properly organized and accessible content library reduces the time spent on searching for components and ensures consistency across projects.

Training of course is crucial in the overall process, alongside easy access to documentation. Providing resources in multiple formats (written documents, videos) is key, in order to accommodate different learning styles. And these are principles we can take away for pretty much any implementation process.

Other notable mentions that might be worth looking into are:

Artificial Intelligence

A buzzword in the worst sense of the term, the thing was pervasive and mostly used in titles of classes that didn’t feature any of it or, at least, didn’t explain how their supposedly AI-assisted tools worked. There were notable exceptions to this, and these are the ones I either attended or intercepted from the digital catalogue.

The first one featured AiCorb and was titled Tackling the Designer’s Dilemma and Risks with AI-Powered Rapid Prototyping. Takuma Nakabayashi and Yoshito Tsuji from the Obayashi Corporation pushed their case for what they called AI-powered Rapid Prototyping: AiCorb addresses two significant dilemmas faced by architects in the early design stages, which occur when architects struggle to generate multiple design options under tight timelines and when clients delay decisions due to insufficient information about design feasibility.

The class breaks down this dilemma into:

  1. the Creativity Dilemma: architects face the challenge of exploring limitless creative possibilities within a limited time frame. They need to provide design options while still maintaining the quality and originality of each concept;
  2. the Efficiency Dilemma: evaluating multiple client-selected options without clear, timely feedback is another struggle, leading to delays in decision-making and inefficiencies in the workflow.

The early architectural design phase often creates a gap between architects and clients. Architects must provide numerous design options, and clients, often unclear about their needs, may delay making decisions. This results in inefficiencies, project delays, and additional rework. The primary challenge is balancing creativity and efficiency while ensuring that the client’s vision aligns with technical feasibility. AiCorb is introduced as an AI-powered tool designed to streamline precisely the early stages of architectural design by automating aspects of the creative process, generating multiple design options quickly and enabling better communication with clients. As advertised, it bridges the emotional and theoretical aspects of design, ensuring clients receive both creative and practical solutions early in the design process.

This solution promotes information sharing during the design phase, supports better decision making, and ultimately resolves dilemmas while reducing risks—all of which enhances client satisfaction.

AiCorb currently comprises two functions:

  • Façade Design, which generates diverse architectural ideas based on rough sketches and design prompts and can rapidly iterate them into options;
  • Modeling Function, which generates designs that are automatically transformed into parametric 3D models and enables quantitative analysis, allowing architects to assess various aspects of the design, such as environmental performance, cost, and material feasibility.

AiCorb addresses the communication gap between architects and clients by offering a platform integrating rapid prototyping with theoretical analysis. This dual approach allows clients to evaluate designs based not only on aesthetics but also on practical data such as costs and sustainability performance. The two dilemmas and the two functions are labeled based on the sphere they leverage upon and, in their words and not mine, are called:

  • Creativity and Emotion: Façade Design Function ensures that the emotional aspects of design, such as aesthetic appeal and personal vision, are preserved and communicated clearly to clients;
  • Data-Driven Insights: the 3D modelling feature provides detailed analyses of key performance metrics, helping clients make well-informed decisions based on quantitative data like energy efficiency, sunlight exposure, and cost estimations.

Their words. Not mine. I would have a lot to say about that but, hey, the tool looks cool. Their handout is here.

Another interesting session involving Artificial Intelligence was delivered by Abhishek Sanjay Shinde who, in his Optimizing Revit Structural Intelligent BIM models with LLM’s & Autodesk Platform Services, laid the basis and provided references for prompt engineering 101 with tips like:

The class then dived into how to retrieve data from a Revit model, using pyRevit, C#/.NET add-ins (as you can see here), Rhino.Inside.Revit and Dynamo, because Artificial Intelligence alone wasn’t enough.
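To make the “retrieve data first, then hand it to the LLM” step concrete, here’s a minimal sketch of my own (not from the handout) in pyRevit-flavoured Python: it dumps a few properties of the structural framing to JSON and wraps them in a prompt. The prompt wording and the 30 ft threshold are purely illustrative.

```python
# Minimal sketch (not from the class): collect structural framing elements and
# serialize a few properties to JSON, ready to be embedded in an LLM prompt.
import json
from Autodesk.Revit.DB import FilteredElementCollector, BuiltInCategory

doc = __revit__.ActiveUIDocument.Document

beams = (FilteredElementCollector(doc)
         .OfCategory(BuiltInCategory.OST_StructuralFraming)
         .WhereElementIsNotElementType())

payload = []
for beam in beams:
    location = beam.Location
    payload.append({
        "id": beam.Id.IntegerValue,
        "type": beam.Name,
        "length_ft": location.Curve.Length if hasattr(location, "Curve") else None,
    })

prompt = ("You are a structural engineer. Flag any beam longer than 30 ft "
          "and suggest what to check:\n" + json.dumps(payload, indent=2))
print(prompt)  # in practice you would send this to the LLM endpoint of your choice
```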

The final aim of the class is to discuss how LLMs (like GPT-4 and BERT) are transforming design processes in the AEC industry, and how structural optimization might benefit from this: it introduces the idea of using LLMs to streamline workflows and suggests that LLM-powered rapid prototyping and BIM integration can improve efficiency and design quality.

The handout is here.

“The capacity to understand the world, understand the physical world, the ability to remember and retrieve things, persistent memory, the ability to reason, and the ability to plan. Those are four essential characteristics of intelligent systems or entities, humans, animals. LLMs can do none of those or they can only do them in a very primitive way and they don’t really understand the physical world. You should work on next-gen AI systems that lift the limitations of LLMs.”
— Yann LeCun in “AI And the Limits of Language”

AI-Nurtured MEP Modeling: Automated Patterns and Engineering Solutions was another content-rich class by Enrique Galicia. You can browse his awesome handout over here, his slides are over here, and in a nutshell these were the topics covered:

  1. AI Automation in MEP: AI can generate reusable MEP templates to ensure consistency and reduce manual errors. By using Dynamo and Revit API, repetitive tasks such as data extraction and pattern structuring can be automated, creating more efficient workflows;
  2. AI-Driven Route Optimization: efficient route planning is critical for MEP systems, and AI helps optimize system routing by analyzing configurations for efficiency. AI-driven tools and Dynamo can automate route planning, minimizing material waste and improving space usage, and this proactive optimization reduces trial and error and ensures better design outcomes;
  3. Proactive Clash Detection: traditional clash detection is reactive and time-consuming, often leading to rework, while AI could shift us to proactive clash detection by identifying potential issues early and resolving them based on predefined rules, reducing manual intervention, project delays, and costs;
  4. Avant Leap’s AI Solutions: Avant Leap, Galicia’s company, offers a range of AI tools designed for MEP modeling, focusing on automating route optimization and proactive clash detection. The tools integrate seamlessly with platforms like Revit, ensuring better project outcomes through automation and real-time decision-making.

Go check it out.

Now, I’m a fan of AI doing your laundry so you can do your art, and I believe in many fields it already does: the class by Dennis Goff, titled Artificial Intern: Let The Robots Get The Coffee!, seems to be along this line. Too bad the handout is literally filled with run-of-the-mill AI-generated pictures, and I’m not a fan of those. The content, anyway, seems good and might provide insight for people looking for in-house applications of Artificial Intelligence. At ZGF Architects, Goff’s company, these included:

  • ZGF.AI: a custom web portal providing access to various AI tools developed by ZGF, replacing commercial solutions like MidJourney and OpenAI with firm-specific applications. It offers flexibility and avoids licensing complexities;
  • Path of Travel Tool: originally a Dynamo tool, it automatically generates travel paths from rooms to egress doors and uses AI to identify circulation spaces based on room names, improving flexibility and usability across projects;
  • Name Standardization Tool: automates the standardization of view names in Revit by analyzing view names and returning properly formatted titles;
  • Ask ZiGFried: a semantic search tool embedded within Revit, allowing users to query building codes, standards, and project-specific documents without leaving the Revit environment, it simplifies searching for project-specific information through vector-based semantic search;
  • Ask Revit: a semantic search tool that queries a Revit model’s data (such as sheets and views) by creating a vector database of the model’s elements and returning results based on meaning, making it easier to find relevant model information (a bare-bones sketch of the idea follows after this list);
  • BIM Doctor: a chat-based AI tool for diagnosing issues in Revit models, which allows BIM coordinators to communicate with the Revit model and quickly identify issues without the need for lengthy health checks or project reviews.
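The “vector database plus meaning-based search” idea behind Ask ZiGFried and Ask Revit is easier to grasp with a bare-bones sketch. The snippet below is not ZGF’s implementation, just the gist of it using the sentence-transformers package over a handful of made-up sheet names.

```python
# Bare-bones sketch of vector-based semantic search over Revit view/sheet names
# (illustrative only, not ZGF's implementation). Assumes sentence-transformers.
from sentence_transformers import SentenceTransformer, util

corpus = [
    "L02 - Mechanical Room Enlarged Plan",
    "A-101 Ground Floor Plan",
    "S-300 Roof Framing Plan",
]  # in practice, harvested from the model via the Revit API

model = SentenceTransformer("all-MiniLM-L6-v2")
corpus_emb = model.encode(corpus, convert_to_tensor=True)

query = "where do I see the HVAC equipment layout?"
query_emb = model.encode(query, convert_to_tensor=True)

# Returns the corpus entries closest in meaning, not just in keywords
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))
```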

The class link is here and you can go from there.

Zaha Hadid Architects were also there to talk about Artificial Intelligence, and you can (almost) always trust them to be on top of things: their class, titled Tectonics via AI at Zaha Hadid Architects and delivered by one of their associates, Vishu Bhooshan, talked about merging structural topology generation tools with genAI across various architectural typologies in early-stage ideation. The “tectonics” in the title is a concept crucial to the philosophy and aesthetics of their design process, particularly the relationship between the structural and aesthetic aspects of architecture. The class has no handout (I guess they were even busier than the Tesla guys), but the presentation talks about using pictures generated by Generative Adversarial Networks (which aren’t the Diffusion Models behind Midjourney) to drive 2D animations and 3D models.

Diffusion Models are used too, but with a different take than the usual “take this kitty, confuse this kitty, generate a new kitty that’s a rip-off of the previous kitty”. The related system is called CLIP: it pre-trains an image encoder to predict which images were paired with which texts in a dataset, and that behaviour is then used to turn CLIP into a zero-shot classifier.
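If “zero-shot classifier” sounds abstract, here’s a minimal sketch of how CLIP is typically used for it through the Hugging Face transformers wrappers. It has nothing to do with ZHA’s actual pipeline, and the image file and labels are made up: the label whose text embedding best matches the image embedding wins.

```python
# Minimal zero-shot classification with CLIP via Hugging Face transformers
# (illustrative only; not Zaha Hadid Architects' pipeline).
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("massing_study.png")  # hypothetical input render
labels = ["a timber gridshell", "a concrete shell", "a steel space frame"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)  # image-to-text similarity scores

for label, p in zip(labels, probs[0].tolist()):
    print(label, round(p, 3))
```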

Then it gets weird about the metaverse, but you can take a look by yourself at the slides here.

And of course another relevant class about Artificial Intelligence, touching upon the topics of ethics in its adoption and usage, was the one by Yael Netser and Michal Burshtein, but I already talked about it here.

Community Involvement and Participatory Planning

I wasn’t the only one discussing how to bring local communities, planners and other stakeholders together. Take a look at this amazing class by Jaqueline Pimentel and Marcio Augusto de Toledo Teixeira from Curitiba’s urban planning agency (Curitiba being “the smartest city in the world”, at least according to the World Smart City Awards). The city employed this array of technologies:

  • LaBIM Curitiba: established in 2018, it integrates BIM methodologies into city planning, fostering more accurate and efficient project execution;
  • GeoCuritiba: an open platform providing geographic data, maps, and applications for public use, aiding in urban planning and community engagement;
  • 3D and 4D Modeling: used to engage the community on various projects like the Tarumã Overpass and Linear Park, enhancing transparency and communication by showcasing project impacts through 3D visuals and simulations;
  • Point Clouds and Volumetric Data: laser scanning and 3D modelling are employed to create accurate city maps, guiding decisions on infrastructure and land use;
  • Autodesk Docs: adopted as a Common Data Environment to streamline document management and ensure consistent communication across teams;
  • Autodesk Build: currently being integrated for construction quality management, streamlining workflows and improving team productivity.

When it comes to community engagement specifically, the handout highlights the use of 3D modelling to visually represent projects such as the Tarumã Overpass and the Linear Park. The models allowed the community to understand proposed changes better and actively engage in the planning process. For example, in the Linear Park project, the 3D visualization showed which trees would be preserved, alleviating residents’ concerns about the environmental impact.

Additionally, Curitiba integrated QR codes linking to the Autodesk Viewer platform, enabling the public to explore project details interactively. This digital engagement tool allows community members to access real-time project information, view plans, and explore construction proposals on their own devices, fostering a more inclusive approach to urban planning.

Their handout is here.

 

My class on Participatory Planning and LEGO Serious Play went well, and I’m really happy with the turnout: it was interesting to see many people brought together by either LEGO, the wish to understand more about community engagement, or simply the knowledge that they could count on me to deliver something weird. With the incredible amount of content happening at the same time, you all honour me. But most of all, I’m happy I could deliver my last lines without crying, which was the real challenge. ❤
