Using Speckle Transports/Serialization to transfer data to a custom endpoint

Dear Speckle founders, employees, and community,

I’m Håkon and I am currently prototyping a Revit<->Spacemaker synchronization plugin on behalf of Spacemaker/Autodesk. Rather than reinventing the wheel, we are trying to use some of the lower level components of your libraries to get a head start with data modeling and serialization. As a bonus, you have solved many of the cross-compiling issues with different Revit versions.

We are truly impressed by what we’ve seen so far!

Our idea is to do something like the following:

            var commitObject = new Base();
            foreach (var o in speckleBases)
                // Detachable property, yes please !!
                commitObject["@" + o.GetId()] = o;
            var (s, t) = Operations.GetSerializerInstance();
            s.WriteTransports = new List<ITransport>() { new MemoryTransport() };
            var obj = JsonConvert.SerializeObject(commitObject, t);

However, the result of this doesn’t include much data, only one reference per selected object:

    {
        "@6c737a551f714a07cd828025424d7fd7": {
            "speckle_type": "reference",
            "referencedId": "4d6f6c3f9163ccbb8e47d8003e2bd425"
        },
        "@c14e446aff8e3a4c5d3b737521ef8664": {
            "speckle_type": "reference",
            "referencedId": "3358420a91a48335122a9fec243a6d6c"
        },
        "totalChildrenCount": 0,
        "speckle_type": "Base",
        "__closure": {
            "a61250e40666ef7af5cf0367b4bd546a": 3,
            "37d61eda7e2cfff131f3f74d71ab08d0": 3,
            "be277d7552821f8ca9f24809837b6541": 2,
            "4d6f6c3f9163ccbb8e47d8003e2bd425": 1,
            "3644f78bc4ef1db14d5080951da66c36": 3,
            "07a6171f07039b014b792d195ad45c99": 3,
            "018a54b4d8e91c681d5bb1fac8e33544": 2,
            "3358420a91a48335122a9fec243a6d6c": 1
        },
        "id": "d4a43160b480f6d7535b6195b0ebf9fe"
    }

If we omit the @ prefix for those references, all data is included, but none of the elements in the tree are detached. We are looking for the meshes so we can turn them into glTF and view them in a web viewer, but we would have to search for them recursively.

What we have been looking for is a way to serialize the selection such that all detachable objects are included at the root level, just like your objects API. This way we can scan the list for types we recognize and display them, and ignore the rest.

Can anyone point out where our approach goes wrong?

  1. Should we avoid making the root items detachable, and instead recursively search the tree for what we want?
  2. Is there a way to include the data that the IDs in __closure point to?
  3. Should we start thinking about our problem differently?
    (4. What is the best way to construct a commitObject? GetId() serializes the object.)

Thank you again!


Actually, when I remove the @, I see that the output still references data that is not included in the serialization. See here: wall-floor.json · GitHub

Hi @hawkaa!

I’m trying to wrap my head around the specific use case - I might get it wrong, but let’s see.

One gist: when you pass in a transport to the serialiser, it will “store and forget” any detachable objects. The references you are looking for are in the transport!

Some other thoughts:

The easiest would be to just drop your speckleBases collection in a detachable list property. That can be done as easily as

commitObj["@allTheObjects"] = speckleBases;

Alternatively, ignore the id itself and just add keys:

var k = 0;
foreach (var o in speckleBases)
  // Note: without the @ prefix, these properties are NOT detached
  commitObject[$"Object-{k++}"] = o;

When sending to a transport, you can also skip quite a few steps and just use Operations.Send(). Off the top of my head:

var memoryTransport = new MemoryTransport();
var objId = Operations.Send( commitObj, new List<ITransport>() { memoryTransport } );
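
To illustrate the “store and forget” point: after Send, the detached children can be read back out of the transport by id. This is only a sketch - it assumes the .NET SDK’s MemoryTransport exposes its id-to-serialized-JSON dictionary as `Objects`, plus the `ITransport.GetObject(id)` lookup; the exact surface may differ between SDK versions.

```csharp
// Sketch only: assumes MemoryTransport keeps an id -> serialized-JSON
// dictionary called Objects (version-dependent).
foreach (var kvp in memoryTransport.Objects)
{
    // kvp.Key is the object's hash id (the same ids that appear in __closure);
    // kvp.Value is the full serialized JSON of that detached object.
    Console.WriteLine($"{kvp.Key}: {kvp.Value.Length} bytes");
}

// Or look up a single child referenced from __closure:
var childJson = memoryTransport.GetObject("4d6f6c3f9163ccbb8e47d8003e2bd425");
```

Scanning that dictionary for speckle_types you recognize (meshes, say) would replace the recursive tree search from the original question.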

This sounds a bit like what the current threejs converter does. That part is quite tied to the way the server streams objects back to the clients, so it’s not portable to .NET. For the processing, are you thinking of using .NET?

Ha, this is because displayMeshes are detached by default (controlled at the object model level).


Thank you so much for your response @dimitrie !

I ended up with this:

            var commitObject = new Base();
            commitObject["@all"] = speckleBases;
            var memoryTransport = new MemoryTransport();
            var objectId = Task.Run(async () => await Operations.Send(commitObject, new List<ITransport>() { memoryTransport })).Result;

Does it look roughly alright? I guess we can implement a custom transport at some point which would be really cool, but this gets us off to a good start I think.

I was really unclear about what I am trying to do. I am sorry about that. I am basically trying to transfer (POST) data from Revit to our APIs and wanted to use the Speckle abstractions to do so. This way we have as little logic as possible in the plugin(s) and can do much of the data processing in our services (TypeScript + Python). Turning some of it into glTF for our users to see is just one of many use cases. We also use glTF as the least common denominator for our geometry processing pipelines.

Let me know if you have any input!


Gotcha! Code looks all right.

This definitely then screams for a custom transport! We’re :100:% happy to help (even under some sort of an NDA, as we discussed before IRL). You could be pushing objects to your endpoint, which would then be able to process them linearly as you see fit.

To process objects, there are usually two ways we do it in Speckle: one by one, as they get streamed - this is really memory efficient, as you can imagine; and the “composed” way, but that’s usually best avoided, as you have to load the whole “cart and horses” into memory. But one step at a time!
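
For reference, a custom transport is “just” an ITransport implementation. A minimal, untested sketch that POSTs each detached object to a hypothetical endpoint might look like this - the endpoint URL is made up, only the core SaveObject/GetObject idea is shown, and the remaining interface members (progress, cancellation, etc.) are omitted and vary by SDK version:

```csharp
using System.Net.Http;
using System.Text;

// Hedged sketch: not a complete ITransport implementation.
public class SpacemakerTransport // : ITransport (remaining members omitted)
{
    private static readonly HttpClient Client = new HttpClient();
    private readonly string endpoint; // hypothetical, e.g. "https://api.example.com/objects"

    public SpacemakerTransport(string endpoint) => this.endpoint = endpoint;

    public string TransportName { get; set; } = "Spacemaker";

    // Called once per detached object as the serializer streams them out.
    public void SaveObject(string id, string serializedObject)
    {
        var content = new StringContent(serializedObject, Encoding.UTF8, "application/json");
        // A real implementation would batch requests and handle async properly.
        Client.PostAsync($"{endpoint}/{id}", content).GetAwaiter().GetResult();
    }

    public string GetObject(string id)
        => Client.GetStringAsync($"{endpoint}/{id}").GetAwaiter().GetResult();
}
```

With something like that in place, the receiving service can process objects one by one as they arrive, which is the memory-efficient path mentioned above.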

Thank you!

Implementing a custom transport makes a lot of sense, indeed. However, for now I’ll continue prototyping a little more on our backends and see whether we can put the data to use.

Super stoked that you guys are willing to help with that. I don’t think an NDA would be necessary. I think we would be looking to create something in between the code I’ve written above and the Speckle Server. I’m unsure whether we should work on the problem from this angle or use the Speckle Server as a base for further development.


I think the server might give you a leg up in some regards, specifically a tested API for storing and retrieving objects for down-the-line processing in various services.

It might also be a burden on the authn & authz sides, as the whole resource-based access control is for sure different from the Spacemaker one, and possibly a hassle to reconcile. Most likely you’d end up with some sort of solution where Spacemaker acts as a proxy (& auth) towards a non-internet-exposed Speckle Server, keeping track internally (outside the Speckle Server) of who created what. I might be totally off track - it’s late and I’m off topic :smiley: