I am trying to send some Rhino meshes and convert them to Revit topography via Grasshopper, but in a Compute environment. I am fully aware that the Speckle Revit components do not work in a headless environment. But I tried sending a mesh through anyway, and even though the POST returned a 500 error and the logs also showed an error, the mesh actually got sent. I want to note this only worked when I manually opened compute.geometry on my EC2 instance; it would not work when I just tried to run it.
I was wondering if there is a trick to this, so that I wouldn't have to manually open the compute.geometry instance to get it working. I'll attach the Python code where I read a Rhino file and send the geometry, and also the GHX for Compute.
import json
import requests
import rhino3dm as rh

# Read the Rhino file and collect every object on the "Topography" layer
rhFile = rh.File3dm.Read('./test.3dm')
layers = rhFile.Layers
topo = []
for obj in rhFile.Objects:
    layer_index = obj.Attributes.LayerIndex
    if layers[layer_index].Name == "Topography":
        topo.append(obj)
topo_meshes = [obj.Geometry for obj in topo]

# Build the DataTree for the "Mesh" input of the Grasshopper definition
topo_to_send = [{"ParamName": "Mesh", "InnerTree": {}}]
for i, mesh in enumerate(topo_meshes):
    serialized_mesh = json.dumps(mesh, cls=__Rhino3dmEncoder)
    key = f"{{{i};0}}"
    value = [
        {
            "type": "Rhino.Geometry.Mesh",
            "data": serialized_mesh
        }
    ]
    topo_to_send[0]["InnerTree"][key] = value

# Encode the Grasshopper definition and post it to Rhino.Compute
# (compute_url and headers are defined earlier, outside this snippet)
gh_decoded = encode_ghx_file('./test.ghx')
geo_payload = {
    "algo": gh_decoded,
    "pointer": None,
    "values": topo_to_send
}
res = requests.post(compute_url + "grasshopper", json=geo_payload, headers=headers)
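For reference, the snippet relies on two helpers that aren't shown. This is a minimal sketch of what they usually look like in rhino3dm/Rhino.Compute examples; the exact implementations here are an assumption, so adapt them if yours differ:

import base64
import json

class __Rhino3dmEncoder(json.JSONEncoder):
    # Serialize rhino3dm geometry through its Encode() method so Compute can rebuild it
    def default(self, o):
        if hasattr(o, "Encode"):
            return o.Encode()
        return json.JSONEncoder.default(self, o)

def encode_ghx_file(path):
    # Read the .ghx definition and base64-encode it for the "algo" field of the payload
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")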
Ok, noob mistake on my end: I had forgotten a context print / RH_OUT output. Now there are no errors, but it's still the same problem. It runs fine when I send a POST request while compute.geometry is manually opened on my EC2 instance, but it doesn't run when I just send a POST request normally.
Also, I can see the topography comes out differently when I run it through Compute vs. through Grasshopper.
What do you mean by “doesn’t run”? Does your Compute instance not get the POST request at all? Does it fail to start running the Grasshopper file, or does it fail during the run?
How is your EC2 instance authenticated in Speckle? Did you add an account in the EC2 instance with access to that stream?
Screenshots of your Rhino.Compute console would be helpful.
This is a scaling issue. Do take into account that Grasshopper is a unit-less modelling program, meaning the numbers you use for modelling only make sense in the context of a Rhino document (which does have modelling units, and those are the units Grasshopper assumes).
i.e. if your Rhino doc is in inches, a sphere of radius 1 in Grasshopper would have a 1 in radius. Change your modelling units to meters, and that same GH file now outputs a sphere with a 1 m radius.
This is what you’re experiencing in those screenshots. I’m guessing the original mesh was in meters, but your default unit in Compute is mm.
You can now know what units the Speckle nodes will be using, because they’re printed to the console every time a component fetches the doc (a bit overkill, I know).
If you need to specify the modelling units in a Grasshopper Compute run, we suggest either:
Changing the default units of the Rhino instance on your EC2 machine so that they match the units you model in.
Using a Python/C# script to create a new headless doc and assign it to Rhino's ActiveDoc. You can set the doc's units to a fixed desired unit or drive them from an input (see the sketch below).
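Here's a minimal GhPython sketch of that second option, assuming the script component sits early in the definition and that hardcoding Meters is acceptable (swap in whatever unit, or an input, you actually need):

import Rhino

# Create a fresh headless document (no template) and set the units we want,
# Meters here purely as an example.
headless_doc = Rhino.RhinoDoc.CreateHeadless(None)
headless_doc.ModelUnitSystem = Rhino.UnitSystem.Meters

# Make it the active doc so downstream components (including the Speckle ones)
# resolve geometry against these units.
Rhino.RhinoDoc.ActiveDoc = headless_doc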
So no, a commit does not get created. I output the stream's string in the Grasshopper script, but when I print it, it's empty, meaning it never actually creates a commit.
Also, the 401 errors might relate to the Speckle components; I suspect it's most likely an authentication error with Speckle. There aren't any other Grasshopper components besides the Speckle ones.
I figured out what was going wrong: even though I had installed Speckle on the EC2 instance, the Grasshopper components were not there, which was causing a missing-components error. I reinstalled the Manager plus the connectors and restarted my instance, and that worked.