Hello lovely people,
I’ve been diving into Speckle, and so far it’s awesome. I managed to send and receive different types of data to my streams using applications and their connectors. I also managed to receive and send those objects using a Python script. Right now I’m trying to combine different objects in Python (experimenting with detached properties, and stuff like that) from different streams into a new stream.
Basically, this worked. The thing is, one of the objects is a 2D GIS map, which I grabbed from an API and put into one of my streams. The other object is a 3D object (basically a house), which I put into another stream. I retrieved both objects in my Python script, combined the two into one Base object, and committed this to a new stream.
In this new stream, I can indeed see the two combined objects in the data view. The 3D viewer, however, renders the GIS data, but very far away. This GIS data is basically a street plan of an area of a square kilometer, and contains coordinates. It is my guess that my 3D object is also present, but very small compared to the GIS data, and probably placed at completely different coordinates.
My question is: can I (programmatically) alter my 3D object so that its x and y coordinates fall within the plane of the GIS data? I’ve seen a tutorial in which this is done using Rhino, QGIS and Grasshopper, I was just wondering if I could do this with Python.
I really hope this is a somewhat coherent story. Maybe I’m approaching this the wrong way, but I’m still in the process of getting a basic grasp of Speckle.
Thanks in advance!
edit: I’ve seen that the GIS data contains a ‘crs’ field, which is apparently a Speckle object. I’ve tried transferring this field into the 3D object, but it gave no result.
Hey Kevin. You are throwing yourself straight into the deep end.
What you describe should be possible. From what I am reading, you have 2 separate questions, though you have answered one yourself. So I’ll answer that first with a question.
What are you trying to combine streams for as a high-level macro task? It is possible, as you’ve seen, but it might not always be necessary. Downstream applications can separately subscribe/receive multiple streams. This can often be a good way of segregating work.
The second point is more straightforwardly likely to be simply a distance-from-the-0,0-origin issue. GIS data is quite often in “real-world” coordinate space, and GIS software is designed to cope with the large numbers that implies. 3D authoring packages (you don’t say how the house was generated) tend to have internal mechanisms to author as close to the model origin as possible to make the math easier.
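To see why distance from the origin matters for rendering: 3D viewers typically hand coordinates to the GPU as 32-bit floats (that assumption is mine, not stated in this thread), and 32-bit precision degrades as coordinate magnitudes grow. A quick stdlib-only illustration:

```python
import struct

def round_trip_f32(x: float) -> float:
    """Round-trip a Python float through 32-bit storage, as a GPU typically would."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Near the origin, sub-millimetre detail survives 32-bit storage...
near_error = abs(round_trip_f32(1.23) - 1.23)
# ...but at real-world GIS magnitudes (hundreds of km), the nearest
# representable value can be millimetres to centimetres away:
far_error = abs(round_trip_f32(155_000.23) - 155_000.23)

print(near_error < 1e-6)   # True
print(far_error > 1e-3)    # True
```

So a small house placed at real-world map coordinates is both tiny relative to the scene extents and sitting where precision is worst, which matches what you’re seeing in the viewer.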
Could you share the Stream with us? (It can be a private message if you prefer.)
The embedded CRS property records which map projection the GIS file uses (different projections exist to translate the globe to a flat surface).
Depending on the sources you are using, there are two ways to tackle this.
In Revit Connector (I’m making an assumption here) you can send to Speckle using a different origin that has been set to translate the model to real-world coordinates. This would bring that model => to the GIS.
In the GIS connectors, you can do the inverse and set an artificial projection that translates a nominated coordinate to be the origin and send the GIS CAD => the model.
Doing all that in Python IS possible, but the geometric definitions are manifold in a Speckle object. You would have to apply the same transformation to the BoundingBoxes, all the base geometry, and then the Mesh geometry of every object. Much easier to author consistently and pick which coordinate space the models will live in.
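For what that transformation looks like on one piece of geometry: a Speckle Mesh stores its vertices as a flat `[x0, y0, z0, x1, y1, z1, …]` list, so a uniform translation is just an offset applied in steps of three. This is a minimal sketch, not a complete solution (it ignores the bounding boxes and other geometry types mentioned above, and the helper name is my own):

```python
from typing import List

def translate_vertices(vertices: List[float],
                       dx: float, dy: float, dz: float = 0.0) -> List[float]:
    """Shift a flat [x0, y0, z0, x1, y1, z1, ...] vertex list by (dx, dy, dz)."""
    out = []
    for i in range(0, len(vertices), 3):
        out.extend([vertices[i] + dx, vertices[i + 1] + dy, vertices[i + 2] + dz])
    return out

# Example: move a unit triangle so it sits at a real-world-scale coordinate.
triangle = [0.0, 0.0, 0.0,  1.0, 0.0, 0.0,  0.0, 1.0, 0.0]
moved = translate_vertices(triangle, 155_000.0, 463_000.0)
print(moved[:3])  # [155000.0, 463000.0, 0.0]
```

In a real script you would have to walk the whole received commit object and apply this to every `vertices` list it contains, plus translate or recompute every bounding box before sending — exactly the bookkeeping that makes the software route easier.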
Thank you so much for your elaborate answer!
For context: I’m a student software developer, and I’m interested in working with geodata and app development. I am, by no means, an expert in the GIS or 3D modeling field. Hence my question: how much can be done using the SDK?
As for the workflow, the GIS data is provided for free by my government, and can be edited by open source software (I’ve installed QGIS), so that’s fairly easy. However, the 3D part is trickier for me. The sample model I used indeed originates from Revit, but I have no access to that application. Many applications are way too expensive, and/or offer no free trials. For uploading to my Stream, I used a combination of Blender and Rhino, of which the latter will soon expire.
I guess what I’m aiming for is combining GIS data of my neighbourhood, adding some models on top of it, and sending the combined output to Unity. One of the layers in the GIS data is land register data, so basically points where a house address is registered. I thought it would maybe be possible to put some 100 3D objects centered on those points, as a first test, using Python and loops.
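Stamping a template model on each land-register point is the same per-vertex translation in a loop. A sketch of the idea, assuming each address has already been extracted from the GIS layer as an (x, y) pair (the function and variable names here are hypothetical):

```python
from typing import List, Tuple

def place_on_points(template: List[float],
                    points: List[Tuple[float, float]]) -> List[List[float]]:
    """Return one translated copy of a flat vertex list per address point."""
    copies = []
    for x, y in points:
        copy = []
        for i in range(0, len(template), 3):
            copy.extend([template[i] + x, template[i + 1] + y, template[i + 2]])
        copies.append(copy)
    return copies

# A 1 m square footprint (just the base quad, for brevity) stamped on two addresses.
footprint = [0.0, 0.0, 0.0,  1.0, 0.0, 0.0,  1.0, 1.0, 0.0,  0.0, 1.0, 0.0]
houses = place_on_points(footprint, [(10.0, 20.0), (30.0, 40.0)])
print(len(houses))     # 2
print(houses[0][:3])   # [10.0, 20.0, 0.0]
```

Each resulting vertex list could then become its own mesh attached to a parent Base object before committing, so the whole batch lands in one commit.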
I’m still in the exploring phase, so I’m also just trying to see what Speckle is capable of.
Thanks for your advice, I’ll go and fiddle with it some more.
ps: this is one of my test commits: Speckle
The shorter answer is that changes to geometry objects are, currently, better made in software than by direct manipulation.
Have you tried Blender?
I’m reusing this thread for a follow-up question if that’s okay!
So after the helpful tips of Jonathon, I’ve indeed focused on working with coordinates in the applications. I’ve followed a few tutorials, and managed to add a georeference to my model (using the BlenderBIM plugin). This model was imported as an IFC4 file into Blender.
However, when I upload my model using the connector, it appears that no geodata has been included. Should this be possible from Blender? Or did I do something wrong? I’ve been struggling with this for a few days now.
I also managed to get a student license for Revit, and I tried sending the model from there, but this gives the same result.
This is a commit I’ve done from Blender: Speckle
Any help would be greatly appreciated!
Hey @kevinpease, nice to see you persevering, and I commiserate that you aren’t seeing the results you want.
Where do you want all these combined models to end up? Both in terms of what software and which location.
I’m not familiar with the metadata handling in Blender Connector, but do you want to send Blender geometry to QGIS?
If you want to end up in Blender and then the viewer, I think your easiest first step is to export from QGIS with the artificial CRS, which will map that to a known point near zero.
If you are determined to end up in real-world space with your 3D data, the Revit route should have worked for you. The trick is specifying a point in the Revit model as a known coordinate and setting that as the Project Basepoint with the real-world coordinates. On export from Revit, under Advanced Settings, you select export to Project Basepoint.
All this is tricky. When I have some moments in the New Year, I will look at some of this GIS interop and get it into our tutorials.
Thank you for your quick answer! I want the data to end up in Speckle, and eventually in a Unity app.
I followed your tip by setting the GIS data to zero, and this way it combined neatly with my 3D model. So now I want to move the model to its real-world coordinates. As a demo, I was aiming to combine everything in one commit, so that I have a basic representation.
On export from Revit, under Advanced Settings, you select export to Project Basepoint.
I haven’t tried this yet, will fiddle with it some more. Today is my last day before the holidays, so I’ll dive into it deeper next year.
Again, thank you so much for your help, happy holidays!