Hi,
I am starting to dive into Speckle Automate and its systems.
My function is supposed to enrich multiple elements with additional metadata.
Given its additive nature, I can do this either via an annotation or via a new model version.
From the documentation it seems that creating a new version is the recommended approach.
Later on I want to use the generated data in a custom Speckle viewer.
New Model version
Unfortunately I get a GraphQLException:
specklepy.logging.exceptions.GraphQLException: GraphQLException: Failed to execute the GraphQL model request. Errors: [{'message': 'Failed to find branch with id new_matching.', ....
I am calling “create_new_version_in_project”:
automate_context.create_new_version_in_project(version_root_object, "new_matching", "message?")
It seems a bit weird that it complains about not finding the model ("branch") that I want it to create. If I pass an existing model instead, it complains that the target model can’t be the model that triggered the automation.
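One workaround I'm considering (untested, and assuming the server exposes a `modelMutations.create` mutation with a `CreateModelInput`, which I haven't verified against the schema): create the target model explicitly via GraphQL before calling `create_new_version_in_project`, so the name can't fail to resolve. Sketched in Python with the query as a plain string:

```python
# Hypothetical workaround sketch: create the target model up front so the
# "Failed to find branch" error can't occur. Assumes the Speckle GraphQL API
# exposes modelMutations.create taking a CreateModelInput; not verified.
CREATE_MODEL_MUTATION = """
mutation CreateModel($input: CreateModelInput!) {
  modelMutations {
    create(input: $input) {
      id
      name
    }
  }
}
"""


def build_create_model_request(project_id: str, model_name: str) -> dict:
    """Build the GraphQL request payload for creating a model by name."""
    return {
        "query": CREATE_MODEL_MUTATION,
        "variables": {"input": {"projectId": project_id, "name": model_name}},
    }


# Example payload; the project id here is a placeholder.
payload = build_create_model_request("my-project-id", "new_matching")
```

The payload would then be POSTed to the server's `/graphql` endpoint with the usual auth header before triggering the version creation.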
I am on specklepy 3.0.0, running the automation locally as a test function.
Annotations
Intuitively I would have preferred to attach the info to the existing objects instead of duplicating the whole model. It seems a bit wasteful when I am only associating additive metadata, but that depends on how the backend handles versions anyway, so I will go with the recommended new-model-version approach if possible. (Insights welcome.)
Regardless: with annotations I was able to associate the data with the correct elements, but I was a bit confused about how the automation data is stored. When I query the data via GraphQL there are multiple lists, and I would have expected them to contain data from previous runs. My goal here is to reuse computations if the input data didn’t change. In “model/automationsStatus/automationRuns”, for example, I would have expected multiple entries listing the automation runs. Instead I get two entries, with the second not containing any data. I would have expected previous runs either here or in the “functionRuns” list further down.
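For context, the query I am running looks roughly like this (reconstructed from memory, so the field names may not match the schema exactly):

```python
# Roughly the query I am using to inspect automation runs on a model.
# Field names reconstructed from memory; they may not match the schema exactly.
AUTOMATION_RUNS_QUERY = """
query AutomationRuns($projectId: String!, $modelId: String!) {
  project(id: $projectId) {
    model(id: $modelId) {
      automationsStatus {
        automationRuns {
          id
          status
          functionRuns {
            id
            status
            results
          }
        }
      }
    }
  }
}
"""
```

It is the `automationRuns` and `functionRuns` lists in this response where I expected to see earlier runs.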
The result data is wiped when a new run is requested, so it is not possible to retrieve a previous run's annotations from within a new automation run, at least with my approach. Maybe you can give me some hints on where to look?
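To make the "reuse computations" idea concrete, this is roughly what I have in mind: fingerprint the input data that feeds the computation and skip recomputation when the fingerprint matches a previous run. A minimal local sketch (the in-memory `cache` dict is a stand-in; in practice the cached results would have to survive across runs, which is exactly what I can't get at):

```python
import hashlib
import json


def input_fingerprint(element: dict) -> str:
    """Stable hash of the input fields that feed the computation."""
    canonical = json.dumps(element, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def enrich(element: dict, cache: dict) -> dict:
    """Return cached metadata if the input is unchanged, else recompute."""
    key = input_fingerprint(element)
    if key in cache:
        return cache[key]  # input unchanged: reuse the previous result
    result = {"matched": True}  # placeholder for the real enrichment logic
    cache[key] = result
    return result


cache: dict = {}
first = enrich({"id": "a", "height": 3.0}, cache)
second = enrich({"id": "a", "height": 3.0}, cache)  # unchanged input: cache hit
```

The open question is purely where to persist `cache` between automation runs, since the run results themselves get wiped.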
TL;DR:
- Creating a new model version from a Python Automate function fails
- Is it possible to get automation annotation outputs from previous runs?