Discussion: A Navisworks Object Model

I’m hoping the great and the good of the Speckleverse can help out my full brain :brain:. I am seeking any thoughts on the various options for the Objects.Navis object model.

All Navisworks models are an aggregation of n models, each represented as a hierarchical tree :deciduous_tree:, with File > Level > Element > Component > Geometry being the flattest, most basic example.

Each of these nodes is a List of that type. Different sources will populate the model data attached to any of these nodes, and which node it is applied to varies by file type (e.g. Revit). To aggregate many source-model formats, the hierarchy differs by type, reflecting how best to maintain informational and geometrical data completeness in a logical inheritance. If you imagine Type/Instance data being present on the parents and children of a nested family, this may make sense to you. What may not is that, depending on the element type, this data may be arbitrarily applied at the Category, Type OR geometry level.
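For illustration, the node shape can be sketched like this (Python rather than the connector’s C#; the class and property names are assumptions for the sketch, not the Navisworks API):

```python
from dataclasses import dataclass, field

# Illustrative sketch only: every node in the tree is the same generic shape,
# and property data can be attached at any level.
@dataclass
class ModelItem:
    name: str
    properties: dict = field(default_factory=dict)   # category -> {name: value}
    children: list = field(default_factory=list)

# The flattest case, File > Level > Element > Component > Geometry, with
# Type data on a parent node and Instance data on its child:
tree = ModelItem("file.nwd", children=[
    ModelItem("Level 1", children=[
        ModelItem("Wall", {"Revit Type": {"FireRating": "60min"}}, children=[
            ModelItem("Wall Instance", {"Revit": {"Mark": "W-01"}}, children=[
                ModelItem("Geometry"),
            ]),
        ]),
    ]),
])
```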

A closer look at a real-world example shows that the geometry of the model may sit at inconsistent depths depending on the source. The :ice_cube:s indicate a geometry node.

When the source is aggregated models from different authors, file sources and formats the resultant tree is, while logically consistent, difficult to manage as a data structure.

I have sketched out 4 possible Speckle object structures to represent the information in a given selection of these model elements in a commit. Each has Pros and Cons which I’m still thinking about. The first is what the Rimshot App finished the hackathon with. Stream attached.

The Tree-Object model (as currently implemented)

This is the implementation using Base objects of this nested tree. Each node is a Base object (Element) that in turn contains @Elements[] with each node being otherwise generically the same until the final ‘leaf’ node which is ALWAYS a geometry node.


Pros:

  • Maintains the parity with the source information
  • No duplicate Elements
  • Filtering by properties cascades to include/exclude descendant nodes


Cons:

  • The nesting of Elements is unknowable and requires recursion
  • The geometry nodes are potentially at many arbitrary depths (e.g. very difficult to handle generically in Grasshopper)

    attempting to map parent element property lists with child element geometry nodes from a quite simple demo file
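The recursion con can be seen in a small sketch (a hypothetical dict shape standing in for nested Base Elements; names are illustrative, not the Objects.Navis classes): the only way to locate geometry is to walk every branch, and the depths that come back are arbitrary.

```python
# Hypothetical shape: each node has `elements` (children); leaf nodes are
# ALWAYS geometry.
def collect_geometry(node, depth=0):
    """Return (depth, node) for every geometry leaf under `node`."""
    if not node.get("elements"):
        return [(depth, node)]
    found = []
    for child in node["elements"]:
        found.extend(collect_geometry(child, depth + 1))
    return found

# Geometry can land at mixed depths within a single commit:
commit = {"elements": [
    {"elements": [{"name": "geo-shallow"}]},
    {"elements": [{"elements": [{"elements": [{"name": "geo-deep"}]}]}]},
]}

leaves = collect_geometry(commit)   # depths here come back as 2 and 4
```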

The Reverse-Tree-Object model

This starts with all Geometry nodes at the root and recursively records ancestor nodes as a tree of geometry-less Element objects.


Pros:

  • Position of geometry is consistent and knowable
  • Ancestor data is retrievable if required


Cons:

  • Accessing ancestor data properties requires recursion
  • Potential for duplicate Ancestors (obviated by detaching)
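Sketching the inversion (same hypothetical dict shape as above; names are illustrative): each geometry node becomes a root, carrying its ancestry as a nested chain of geometry-less elements, nearest ancestor first.

```python
def invert(node, ancestors=(), out=None):
    """Return a flat list of geometry nodes, each wrapping its ancestry
    as a nested chain of geometry-less elements."""
    if out is None:
        out = []
    if not node.get("elements"):                 # geometry leaf
        chain = None
        for a in ancestors:                      # `ancestors` is root-first
            chain = {"name": a["name"], "ancestor": chain}
        out.append({"name": node["name"], "ancestor": chain})
    else:
        for child in node["elements"]:
            invert(child, ancestors + (node,), out)
    return out

tree = {"name": "file.nwd", "elements": [
    {"name": "Wall Type", "elements": [{"name": "geo"}]},
]}
flat = invert(tree)
# flat[0] is "geo"; its ancestor is "Wall Type", whose ancestor is "file.nwd"
```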

Ancestry-Element-List model

This replaces the object tree of ancestors with a List ordered by the depth of the ancestor node in the tree.


Pros:

  • Position of geometry is consistent and knowable

Cons:

  • Potential for duplicate Ancestor elements (obviated by detaching)
  • The parity between each ancestor list item and its source node will be lost when compared between geometry objects
  • Accessing ancestor data properties requires looping
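A sketch of the difference (illustrative names; per-node properties flattened to a single dict for brevity): the ancestry becomes a flat, depth-ordered list, so reading inherited data is a loop rather than a recursive walk.

```python
def to_ancestor_list(node, ancestors=(), out=None):
    """Each geometry node carries a flat `ancestors` list, root-first."""
    if out is None:
        out = []
    if not node.get("elements"):                 # geometry leaf
        out.append({
            "name": node["name"],
            "ancestors": [{"name": a["name"],
                           "properties": a.get("properties", {})}
                          for a in ancestors],
        })
    else:
        for child in node["elements"]:
            to_ancestor_list(child, ancestors + (node,), out)
    return out

def lookup(geometry, prop):
    """Loop (not recurse) from the nearest ancestor up to the root."""
    for a in reversed(geometry["ancestors"]):
        if prop in a["properties"]:
            return a["properties"][prop]
    return None

tree = {"name": "f", "elements": [
    {"name": "Type", "properties": {"FireRating": "60min"},
     "elements": [{"name": "geo"}]},
]}
stacks = to_ancestor_list(tree)
```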

Ancestry-Property-List model

Instead of a list of ancestor nodes, the properties of the geometry are populated as lists, one per category:property_name, ordered by the depth of the ancestor node in the tree.


Pros:

  • All data preserved for a Geometry leaf node
  • Filtering by properties is straightforward (though maybe not in the viewer)
  • No duplicate objects (property values may be duplicated, but I’m assuming detaching is unnecessary)


Cons:

  • As ancestors may or may not have particular properties, the parity of each list item and its source node will be lost
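A sketch of the flattening (illustrative names, with properties keyed category:property_name and value lists ordered root-first by depth):

```python
def to_property_lists(node, ancestors=(), out=None):
    """Flatten each geometry leaf to {"category:prop": [values by depth]}."""
    if out is None:
        out = []
    if not node.get("elements"):                 # geometry leaf
        props = {}
        for a in ancestors:                      # root-first, i.e. by depth
            for category, pairs in a.get("properties", {}).items():
                for name, value in pairs.items():
                    props.setdefault(f"{category}:{name}", []).append(value)
        # Con in action: a value's list index no longer tells you which
        # ancestor contributed it once some ancestors lack the property.
        out.append({"name": node["name"], "properties": props})
    else:
        for child in node["elements"]:
            to_property_lists(child, ancestors + (node,), out)
    return out

tree = {"name": "f", "elements": [
    {"name": "Family", "properties": {"Item": {"Name": "Tank"}}, "elements": [
        {"name": "Type", "properties": {"Item": {"Name": "Big"}},
         "elements": [{"name": "geo"}]},
    ]},
]}
flat = to_property_lists(tree)
# flat[0]["properties"]["Item:Name"] == ["Tank", "Big"]  (ordered by depth)
```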

Heya @jsdbroughton, on a pretty related note, we’re currently in the process of scoping how Speckle manages Containers to assist with model organization and to make working with groupings like Rhino/AutoCAD layers or Blender collections easier. This in particular would have ramifications for a Navis connector where federated models & files need to be created in reverse tree traversal order.

We’re currently considering 2 options, ordered by preference:

  1. A new Container base class with a name and elements prop. These can have application-specific dynamic props attached to them as well to account for things like layer visibility, display style, etc.

  2. A flag on Base indicating that it is to be treated conceptually as a Container

Container trees could be included on a new ModelInfo class as well to capture file structure regardless of objects sent in a commit.
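For what it’s worth, option 1 as described might look roughly like this (a Python sketch of the shape only; the real thing would presumably be a C# Base subclass in speckle-sharp, and every name here is an assumption):

```python
from dataclasses import dataclass, field

@dataclass
class Container:
    """Sketch of the proposed Container: a name, its elements, and room for
    application-specific dynamic props (layer visibility, display style...)."""
    name: str
    elements: list = field(default_factory=list)
    dynamic: dict = field(default_factory=dict)

# A Container tree capturing file structure, independent of the objects
# actually sent in a commit:
model_info = Container("model.nwf", elements=[
    Container("model_a.nwc", elements=[
        Container("Level 1", dynamic={"visible": True}),
    ]),
])
```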

Would have to ping @AlanRynne to see if he has thoughts about how commit traversal may or may not change in GH as a result of this, but mostly giving you a heads up that it may make sense for you to abstract any non-geometric objects as Containers in Navis if we do move forward with implementing this in our 2.7.0 release.

Also, if you have any thoughts or feedback on a Container class (especially with respect to Navis), we’d love to hear!


Option 1 Container class sounds similar enough to what I have already implemented with only a little special sauce. While representing the full aggregated model or aggregations of aggregations of models this would be the ‘maximum parity’ option.

What I found on receiving the data was enough of a counter-argument that I was motivated to look for alternatives. This is driven by the resultant disassociation of datasets from geometry.

I’m going to implement Option 4 from above and compare. It’s a bit “horses for courses” as to what the ideal outcome is, which is why I was opening the floor.

I think there may be an interesting middle ground with what you have suggested, in that the conversions I have concentrated on to date are selection-based, so the root node of the selection wouldn’t necessarily be the Container root. Looking at the reverse-tree model, this would in fact reveal ancestor nodes related to typed Containers (e.g. Files & Layers).

As a consumer, I’d then be hoping the heavy lifting of propagating nested Collection datasets to leaf nodes falls to the connectors.


Also, if you have any thoughts or feedback on a Container class (especially with respect to Navis), we’d love to hear!

… I was hoping for more answers than to be set homework assignments


I think that the “Tree-Object model” is the most intuitive one, and also what people are already used to: it is more or less what Speckle already does, and the same as Navis and other web-based viewers.

All our connectors are already set up to handle conversion recursively, so it should not be a problem if geometry gets nested at different levels.
The new additional Base classes mentioned by @clrkng, combined perhaps with a set of helper methods (and nodes in GH), could make working with such a hierarchical structure simpler too.


So, the aggregation of each layer’s datasets to the geometry node ideally should be left to the connectors / receiver?

The commented objects on:

Demonstrate a problem I’m encountering. In the source Revit file, there is a single property applied to both elements which, by the time Navis has mucked about with them, is applied at a different branching level in the tree. The viewer renders the filter colour correctly, but the object data is inconsistent.


Ah! Can you send a screenshot of what that relationship/property looks like in Navis?

The big momma of a tank

The Revit properties in this instance are baked into the Geometry Node which is:
Family > Type > Family:Type

The small tank (which makes up 50% of the mesh count for that model - embedded BIM objects)

Here the same properties (So far as Revit cares) are at a similar branch level on the Composite Object:
Family > Type > Family:Type

The geometry nodes are still some way below that Element properties level and not at the same depth.

Those lovely Russian :ru: valves are:
Family > Type > Family:Type > InsertGroup > CompositeObject > InsertGeometry > Geometry

If there are specific data needs on reading (e.g. Asset Information Model validation), these can be accommodated and laced back together, but not in a generic way, and it looks like the above spaghetti in GH. :spaghetti: :spider_web:


I have progressed thoughts and implementation to an improved state and have a few steps left to do to improve it further.

Essentially this is to take an opinionated stance that the Elements with their data (or Element Assemblies) are the key data point and that the structure of the actual Navis file isn’t important. @clrkng the opinionated position here is to delay implementing collections above the level of composite objects. This could follow later.

Also since the last update, I have improved the geometry conversion a bunch. It’s now more lightweight and removes some of the remnant duplicate instance cruft that the API leaves in place.

Concentrating on fidelity of conversion and preservation of elements

Phase 1 - Denormalised Geometry Stacks (implemented example at the bottom)

The geometry work happens to leave me with a flat list of all the Geometry nodes. This data iteration takes that atomised list; each commit then has a Base element for each geometry, with a nested descendant hierarchy starting from the first meaningful data object (determined by my criteria) and ending with the geometry node.
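As a sketch of what Phase 1 produces (illustrative dict shape, with a stand-in "first ancestor carrying properties" criterion in place of my actual rules):

```python
def first_meaningful(ancestors):
    """Stand-in criterion: the shallowest ancestor that carries any data."""
    for i, a in enumerate(ancestors):
        if a.get("properties"):
            return i
    return len(ancestors)

def geometry_stacks(node, ancestors=(), out=None):
    """One denormalised stack per geometry node: from the first meaningful
    data object down to the geometry leaf, ignoring the file structure above."""
    if out is None:
        out = []
    if not node.get("elements"):                 # geometry leaf
        start = first_meaningful(ancestors)
        stack = [{"name": a["name"], "properties": a.get("properties", {})}
                 for a in ancestors[start:]]
        stack.append({"name": node["name"], "geometry": True})
        out.append(stack)
    else:
        for child in node["elements"]:
            geometry_stacks(child, ancestors + (node,), out)
    return out

tree = {"name": "file.nwd", "elements": [
    {"name": "Family", "properties": {"Item": {"Name": "Tank"}}, "elements": [
        {"name": "Type", "elements": [{"name": "geo"}]},
    ]},
]}
stacks = geometry_stacks(tree)
# The file node is dropped; the stack starts at "Family", the first node with data.
```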

Rationalised Properties Sets

Phase 2 - Coalesced Composite Geometry Stacks

Next step will be to de-dupe where the Meaningful root nodes are shared by multiple geometries and coalesce.
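The coalescing step might look like this sketch (grouping by root name as a stand-in for a stable node id):

```python
def coalesce(stacks):
    """Group denormalised stacks that share the same meaningful root so the
    root element appears once, with all of its geometries beneath it."""
    grouped = {}
    for stack in stacks:
        root, rest = stack[0], stack[1:]
        key = root["name"]          # stand-in for a stable node identifier
        grouped.setdefault(key, {"root": root, "geometries": []})
        grouped[key]["geometries"].append(rest)
    return list(grouped.values())

stacks = [
    [{"name": "Tank"}, {"name": "geo-1", "geometry": True}],
    [{"name": "Tank"}, {"name": "geo-2", "geometry": True}],
    [{"name": "Valve"}, {"name": "geo-3", "geometry": True}],
]
merged = coalesce(stacks)
# Two merged elements: "Tank" carrying two geometries, "Valve" carrying one.
```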

Phase 3
This is the planned future development where the UI will allow for user selection of Properties much like the Navisworks Quick Properties (indeed this may be the interface to piggyback). These selected properties will then be added to the root and leaf of each element in a commit. This allows for easier downstream analysis.

Categorised by a selected property propagated through the object model

Having used Navis for years and wrestled with its Object Model from a data-audit point of view, the approach above most closely reflects lessons learned from an ETL perspective.