How to write to a transport

I was able to use the local send/receive, but I’m trying to understand how to use a transport in that process. I can create them, but I can’t figure out how to write data to them.




Hi @Greg_Schleusner! I wish this had a simple answer, but there are some TODOs on our end. Writing is not a problem:

Just connect it to a sender! (PS: you can connect multiple transports in there, including stream URLs, to push to all of them simultaneously.)
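
To make the "one sender, many transports" idea concrete, here's a minimal Python sketch. This is not the actual Speckle API; `MemoryTransport`, `save_object`, and `send` are illustrative names showing the fan-out pattern: serialize once, write to every attached transport.

```python
import hashlib
import json


class MemoryTransport:
    """Illustrative in-memory transport: stores serialized objects by id."""

    def __init__(self):
        self.objects = {}

    def save_object(self, obj_id, serialized):
        self.objects[obj_id] = serialized


def send(obj, transports):
    """Serialize obj, derive a content hash as its id, and fan the
    write out to every transport: one send, many destinations."""
    serialized = json.dumps(obj, sort_keys=True)
    obj_id = hashlib.sha256(serialized.encode()).hexdigest()[:32]
    for transport in transports:
        transport.save_object(obj_id, serialized)
    return obj_id


# One send call pushes the same object to both "remotes".
local, remote = MemoryTransport(), MemoryTransport()
obj_id = send({"speckle_type": "Base", "height": 3.0}, [local, remote])
assert local.objects[obj_id] == remote.objects[obj_id]
```

Because objects are addressed by a content hash, every transport ends up holding an identical copy under the same id, regardless of where it physically stores it.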

Receiving is a problem, though. I saw @AlanRynne typing, so I’ll let him deliver the bad news :grimacing:


Hey @Greg_Schleusner!

Let me start by welcoming you to our community! Feel free to Introduce yourself 🙆 if you feel like it :grin:

As for the receiving end of things… as Dimitrie said, there’s some work pending on our side to allow the Receiver node in Grasshopper to understand what a transport is and how to deal with it.

This is part of a pending architecture change in how transports work throughout Speckle, so we’d be happy to hear any insight or use case you may have in mind.

The local send/receive nodes work exclusively with the local SQLite DB inside your Speckle folder in %appdata%, but nothing else.
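
As an illustration of what "a local SQLite DB" of objects can look like, here's a toy content-addressed store in Python. The table name and schema are made up for the example and are not Speckle's actual schema; the point is the immutable, hash-keyed storage model the thread describes.

```python
import hashlib
import json
import sqlite3


class SqliteTransport:
    """Toy content-addressed object store, in the spirit of a local
    SQLite cache. Schema and names are illustrative, not Speckle's."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS objects (hash TEXT PRIMARY KEY, content TEXT)"
        )

    def save_object(self, obj):
        serialized = json.dumps(obj, sort_keys=True)
        obj_hash = hashlib.sha256(serialized.encode()).hexdigest()[:32]
        # INSERT OR IGNORE: objects are immutable, so the same hash
        # always means the same content -- re-saving is a no-op.
        self.db.execute(
            "INSERT OR IGNORE INTO objects VALUES (?, ?)", (obj_hash, serialized)
        )
        return obj_hash

    def get_object(self, obj_hash):
        row = self.db.execute(
            "SELECT content FROM objects WHERE hash = ?", (obj_hash,)
        ).fetchone()
        return json.loads(row[0]) if row else None
```

Swap `":memory:"` for a file path and you have a persistent local cache that send/receive nodes could read and write without any server involved.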

If you’re keen on the technical details, you can find a longer explanation here :point_down:t3:


Thanks, I’m just testing at this point… Dimitrie pointed it out so I thought I’d take a look.

An unrelated question: could a transport be a serialization in the future, based on how it’s architected? 3DM, DWG, IFC, etc.?

…knowing that if it was more than geometry, it might not be a complete representation?




Re the unrelated question: serialisation is currently centralised, as Speckle needs to control the whole business of splitting up the object graph into immutable chunks (the decomposition API story). The architectural basis is, though, reflected in the Kits story, so that’s where I would see 3DM, DWG, and IFC being reflected.

We are actually working now on a service to parse IFC files into Speckle, with later plans to allow the reverse too, if needed. We see them as “upload X file to Speckle” and, potentially down the line, “download this commit as X”.

I also think I can probably hack together, in 10 minutes, a C# script that does transports properly in Gh. Let’s see if I meet my deadline :smiley:


Thanks for the info. I think a centralized-only model for this will be problematic in the long run. In a lot of workflows, the other app that wants, for example, an IFC or Rhino file is sitting next to you on the network, or on your own computer.

Imagine a colleague has the column model in a Rhino worksession (linked), and I want to send updates of those columns to a Rhino model sitting on the network and to someone in Revit at the same time. The Rhino user would have to go to a web page and download a new version manually, versus just linking from a folder and getting the best of Rhino’s ability to simply reload.

This points to workflows where the point of sending the data is not for the user to migrate it, but simply to reference it. An architect wants the structural engineer’s columns, but not in the file they live in: in something they can reference. Most actual collaboration in AEC is for reference, not for editing, in my estimation.

The same goes if you have a colleague working in Solibri who just wants to get updates during a design review: Person A running a design tool, streaming changes; Person B navigating the model in Solibri. Currently, the Solibri user can just turn on Auto refresh and get updates if the file points at a network location.

This is one of the best things about Git: it works well on the desktop and on the web, not one or the other.

This is preachy, and it’s not meant to be… I just want to point out how much work doesn’t look like stream sender/receiver, and that we shouldn’t have to give up the best things about files and networks to get the great benefits of the web. To me it’s a false choice that we shouldn’t have to make.

The obvious solution, to me, is a desktop “client” or “client server” service that can ingest these streams, produce these outputs in traditional workflows, and keep them up to date. Happy to describe this in detail if there is interest.

Thanks for listening to my diatribe


Hey, all good thoughts in here! I do think, though, that there’s a misunderstanding about how Speckle works, and this is probably our fault for not communicating things properly.

Nope! The Speckle Rhino Connector would display a live notification that stuff changed, and prompt you to update your model. I’m dredging up a screenshot/gif right now (thx @AlanRynne)!


(and here you can see already some extra nice hints you don’t usually get: who updated what, on what branch)

Agreed 100% on this one!

Speckle connectors literally do just that.

Again, we agree 100% here, and Speckle allows you a ton of flexibility in this regard. This is an important point:

  1. You can push data to any transports you want. These transports can interface with, for example, a server. Just like git, you can have multiple remotes. Some of these remotes can be local, on a network somewhere, in a cloud, wherever!

  2. Re files: we have a file transport that basically… you guessed it, writes files! Where it writes them is, again, up to you!

  3. You can have a transport writing directly to a database. Again, that database can be local, remote, wherever!
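
The file-transport idea in point 2 can be sketched in a few lines. This is a hedged illustration, not Speckle's actual file transport: each object is written as one file named by its content hash, into whatever folder you point it at (a local directory, a network share, a synced drive).

```python
import hashlib
import json
import tempfile
from pathlib import Path


class FileTransport:
    """Illustrative file-based transport: each object becomes one file,
    named by its content hash, in a folder of your choosing."""

    def __init__(self, base_dir):
        self.base = Path(base_dir)
        self.base.mkdir(parents=True, exist_ok=True)

    def save_object(self, obj):
        serialized = json.dumps(obj, sort_keys=True)
        obj_hash = hashlib.sha256(serialized.encode()).hexdigest()[:32]
        (self.base / obj_hash).write_text(serialized)
        return obj_hash

    def get_object(self, obj_hash):
        return json.loads((self.base / obj_hash).read_text())


# The target folder can be local, a network share, a synced drive, etc.
with tempfile.TemporaryDirectory() as folder:
    transport = FileTransport(folder)
    obj_id = transport.save_object({"speckle_type": "Base", "name": "column"})
    assert transport.get_object(obj_id)["name"] == "column"
```

This is essentially the "reference, don't migrate" workflow from earlier in the thread: a colleague's app can watch that folder and reload, while the same send also pushes to a server or database transport.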


Sounds like it’s all there… thanks for the detailed response.


For the sake of posterity, we’ve now implemented two quick Grasshopper components that allow you to send and receive directly from a given transport. Once this PR is merged, they’ll make it into the mainline in our next release.