Those who attended last week’s community standup know my face already: I just started my master’s thesis …on Speckle!
More precisely, I will be looking into the state of automation in AEC, and what an automation platform on top of Speckle could look like. (And I’ll keep you updated here, so if automation is your jam, be sure to keep an eye out for new posts!)
Tools & Concepts
To start things off, I’d like to get a general idea of the spread of different tools and concepts in the more tech-forward pockets of AEC. What tools are you familiar with? Which ones are actually part of your day-to-day workflows?
This list is necessarily incomplete. Miss something? Let me know in the comments! (Or if you see it mentioned already, give that post a like!)
And now for the fun part: if you found your way to the forums, chances are you have your occasional “Dang, I wish there was an easy way to automate that task!” moments. (And make ‘easy’ whatever you want it to be here.) I’d be interested to hear about them!
You’re already automating parts of your workflows with Speckle? Cool! Let me know about those, too!
We’ll be keeping things intentionally open for now, so be as specific or pie-in-the-sky as you like.
Thanks for tagging me, @dimitrie. This is super interesting.
I am actually finishing my master’s thesis on this very same problem in two months, so if you want to reach out, @messismore, I am sure we can bounce ideas back and forth.
The problem arose from my work with rhino.compute at Schmidt Hammer Lassen Architects, which is an affiliate of the PerkinsWill network of studios. We have a massive library of scripts and automated processes (.gh, .dyn, .cs, .py). But they are really hard to maintain, they usually require an expert to run, and they need loads of dependencies deployed on my colleagues’ machines.
We started tackling the .gh problems with rhino.compute. But what happens if we want to support other executors, say something to run .dyn scripts?
I was truly inspired by how CircleCI and GitHub Actions let you declare something like runs-on: node:alpine, and I am proposing a framework that will hopefully support a collection of executors, each running a specific class of jobs (rough sketch after the list below).
Now for the actual use case, with such a framework, we hope to:
Orchestrate a collection of scripts based on their dependencies (i/o)
Mix and match scripts of different executor types (e.g. .gh + .dyn)
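To make that a bit more concrete, here’s a rough sketch of the kind of framework I have in mind. All the names are made up and the executor bodies are just stubs: a registry of executors keyed by script type, plus a tiny orchestrator that runs jobs in dependency order, so .gh and .dyn steps can be mixed in one pipeline.

```python
# Rough sketch only: hypothetical names, stubbed executors.
from dataclasses import dataclass, field
from graphlib import TopologicalSorter  # Python 3.9+
from typing import Callable, Dict, List

# An "executor" is anything that can run one class of job,
# e.g. rhino.compute for .gh files or a headless Dynamo runner for .dyn files.
EXECUTORS: Dict[str, Callable[[str, dict], dict]] = {}

def executor(kind: str):
    """Register a function as the executor for a given script type."""
    def register(fn):
        EXECUTORS[kind] = fn
        return fn
    return register

@executor(".gh")
def run_grasshopper(script: str, inputs: dict) -> dict:
    # e.g. POST the definition + inputs to a rhino.compute endpoint
    return {}

@executor(".dyn")
def run_dynamo(script: str, inputs: dict) -> dict:
    # e.g. shell out to a headless Dynamo runner
    return {}

@dataclass
class Job:
    name: str
    kind: str                                       # which executor to use, à la runs-on
    script: str                                     # path to the .gh / .dyn / ... file
    needs: List[str] = field(default_factory=list)  # upstream jobs (i/o dependencies)

def run_pipeline(jobs: List[Job]) -> dict:
    """Run jobs in topological order, feeding each one its upstream outputs."""
    by_name = {j.name: j for j in jobs}
    order = TopologicalSorter({j.name: set(j.needs) for j in jobs}).static_order()
    outputs: dict = {}
    for name in order:
        job = by_name[name]
        inputs = {dep: outputs[dep] for dep in job.needs}
        outputs[name] = EXECUTORS[job.kind](job.script, inputs)
    return outputs
```

The nice part is that supporting a new script type is just one more registered executor; the orchestration logic doesn’t change.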
I’ve got the dependencies/orchestration part kind of sorted for my prototype, and now I am struggling a bit with data communication between the scripts. I could push .json to and from a database, but I am trying hard to make Speckle my data service, so that I get all the awesomeness that’s there (commit history, branches, webhooks and connectors).
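For the Speckle route, I’m imagining something along these lines with specklepy, where one script commits its results to a stream and the next script receives them as inputs (the stream ID, branch name and the shape of the results are all placeholders):

```python
# Rough sketch, not production code: all IDs and data below are placeholders.
from specklepy.api.client import SpeckleClient
from specklepy.api.credentials import get_default_account
from specklepy.api import operations
from specklepy.transports.server import ServerTransport
from specklepy.objects import Base

client = SpeckleClient(host="speckle.xyz")
client.authenticate_with_account(get_default_account())

stream_id = "your-stream-id"  # placeholder
transport = ServerTransport(client=client, stream_id=stream_id)

# --- script A: publish its output ---
results = Base()
results.areas = [12.5, 33.1, 7.8]  # whatever script A produced
obj_id = operations.send(base=results, transports=[transport])
commit_id = client.commit.create(
    stream_id=stream_id,
    object_id=obj_id,
    branch_name="automation/step-a",  # placeholder branch
    message="output of step A",
)

# --- script B: receive script A's output as its input ---
commit = client.commit.get(stream_id, commit_id)
step_a_output = operations.receive(commit.referencedObject, transport)
print(step_a_output.areas)
```

That way every intermediate result is versioned on the stream, and a webhook on the commit could trigger the next step in the chain.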
Wow, that sounds super interesting! Sounds like we’re motivated by the same pain.
As I wrote in my post, I’m only getting started and am still pretty much in the process of learning what the most meaningful contribution might entail and how to scope it.
But I’d love to get in touch whenever you have a minute!
I added a vote for using AWS Lambda, but most of my automation is GCP-based, using either Cloud Run or Cloud Functions (Lambda-like) or Dataflow for more ETL-type stuff. These are strung together with webhooks, Pub/Sub, or scheduled tasks. In fact, I’ve never used AWS at all.
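Roughly, the glue between those pieces could look like this: an HTTP-triggered Cloud Function receives a Speckle webhook and forwards the event to Pub/Sub for whatever job runs next (the project/topic names and the exact webhook payload shape are assumptions on my side):

```python
# Rough sketch: HTTP-triggered Cloud Function relaying a Speckle webhook to Pub/Sub.
# Project/topic names and the payload structure are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "speckle-events")  # placeholders

def handle_speckle_webhook(request):
    """Cloud Functions entry point (HTTP trigger); `request` is a Flask request."""
    event = request.get_json(silent=True) or {}
    payload = event.get("payload", {})  # adjust to the actual webhook schema
    publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    return ("ok", 200)
```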