I’ve read some past posts about this and it sounds like it’s still not possible, but I’m curious if anyone’s found any worthwhile tricks.
The problem I’m having is that I’m visualizing a site model (just the mesh) in a Speckle Viewer. I have a texture that maps to its UV coordinates correctly, but I can’t figure out how to assign a material that uses this image texture as its diffuse map instead of a solid colour. Is it not possible at all? Are there any worthwhile workarounds? It’s just hard to give any context to this model without that texture.
We are already doing textures in the Speckle viewer. It’s a bit of a hack, but we are basically extending the SpeckleStandardMaterial to use the common three.js material parameters and an adjusted shader.
The tricky part was getting it to work properly with the hash and batching system. I am pretty sure our solution has some bugs that would become apparent in a more general use case.
The other big topic in getting materials to work is the UV maps. I assumed that we cannot rely on UVs being available, given that the model might be coming from a plethora of different softwares. Our solution uses a custom tri-planar-style UV mapping that works for most situations but can produce some weird artefacts in special cases.
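For anyone curious what tri-planar mapping looks like in principle: the surface normal is used to blend three planar projections of the world position, so no authored UVs are needed. This is a minimal, dependency-free sketch of that idea (names and the sharpness parameter are illustrative, not our actual shader code, where this would live in GLSL):

```typescript
type Vec3 = { x: number; y: number; z: number };

// Blend weights from the surface normal: each axis contributes in
// proportion to how much the surface faces it. The exponent sharpens
// the transition between the three projections.
function triplanarWeights(normal: Vec3, sharpness = 4): Vec3 {
  const wx = Math.pow(Math.abs(normal.x), sharpness);
  const wy = Math.pow(Math.abs(normal.y), sharpness);
  const wz = Math.pow(Math.abs(normal.z), sharpness);
  const sum = wx + wy + wz;
  return { x: wx / sum, y: wy / sum, z: wz / sum };
}

// Three planar projections of the world position; `scale` controls
// texture tiling density. The texture is sampled once per projection
// and the results are mixed with the weights above.
function triplanarUVs(p: Vec3, scale = 1) {
  return {
    uvX: { u: p.y * scale, v: p.z * scale }, // projection along X
    uvY: { u: p.x * scale, v: p.z * scale }, // projection along Y
    uvZ: { u: p.x * scale, v: p.y * scale }, // projection along Z
  };
}
```

The “weird artefacts in special cases” mentioned above typically show up on surfaces at 45° to the axes, where two projections blend with equal weight.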
Additionally, roughness and normal maps work, but everything breaks down once we get into floating-point precision territory, since we skipped the fancy precision tricks that Speckle does.
I am happy to share some of the solutions, it is just quite entangled with other changes that we’ve made.
I can also share some of my experiences, and I’d be curious about the approach you are currently cooking up. What is your take on UVs? Are they / will they always be available in the display data?
I’m sure putting everything together was not easy!
We’re currently working with the premise that “you get what you put in.” As in: if your UVs are bad, it’s going to look bad. For now we are not planning on “fixing” UVs on the fly, or even generating any if they are missing.
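In the spirit of “you get what you put in”, a consumer of the display data could still run a cheap sanity check before trusting the UVs, without attempting to repair or generate anything. A hypothetical sketch (the function and its inputs are illustrative, not part of the viewer API):

```typescript
// Returns true only if a UV attribute exists, matches the vertex
// count (2 floats per vertex), and contains no NaN/Infinity values.
// Anything else renders as garbage, so we just refuse it.
function hasUsableUVs(vertexCount: number, uvs?: Float32Array): boolean {
  if (!uvs) return false; // missing entirely
  if (uvs.length !== vertexCount * 2) return false; // wrong cardinality
  for (let i = 0; i < uvs.length; i++) {
    if (!Number.isFinite(uvs[i])) return false;
  }
  return true;
}
```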
We are, and will be, storing UVs alongside the other vertex attributes.
We would be very happy if you shared whatever you feel like! We don’t get many chances to discuss custom user implementations, so it’s very exciting when we do get the chance!
As for our take: we currently have a running prototype where only base color textures are supported, but the approach is valid for any other texture that three.js already supports in its default PBR workflow, which we are still working off of. However, three.js has (always had) a tendency to refuse to cooperate on the more custom stuff we had to do, and that is where most of the pain points lie.
We’re also decoding textures in workers and applying them to objects as they become ready. One thing we want to avoid with textures is adding to the load time.
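The “apply as they become ready” pattern described above can be sketched without any viewer specifics: kick off all decodes concurrently and attach each texture the moment its decode resolves, so geometry rendering is never blocked on images. Here `decodeTexture` stands in for the worker-side decode (e.g. a worker plus `createImageBitmap`); all names are illustrative, not Speckle’s API:

```typescript
type ApplyFn = (objectId: string, texture: unknown) => void;

// Starts every decode at once; each object receives its texture the
// moment its own decode finishes, independently of the others.
async function loadTexturesInBackground(
  requests: Array<{ objectId: string; url: string }>,
  decodeTexture: (url: string) => Promise<unknown>,
  apply: ApplyFn,
): Promise<void> {
  await Promise.all(
    requests.map(async ({ objectId, url }) => {
      const tex = await decodeTexture(url);
      apply(objectId, tex); // object was already visible untextured
    }),
  );
}
```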
What we currently have should be enough for most cases. If things get hairy, we’ll have to switch it up a bit and implement texturing on a per-batch basis, where we bind texture arrays and dynamically index into them in the fragment program (nothing ironed out, just ideas).
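The CPU side of that texture-array idea is mostly bookkeeping: each unique texture in a batch gets one layer of the array, and each object carries the layer index its fragments should sample. A small sketch of that assignment (purely illustrative, since nothing is ironed out on our side either):

```typescript
// Deduplicate texture ids into texture-array layers for one batch.
// `layers` gives the upload order into the array texture; `layerOf`
// is what each object would carry (e.g. as a per-vertex or per-object
// attribute) to index the sampler2DArray in the fragment program.
function assignTextureLayers(textureIds: string[]): {
  layers: string[];
  layerOf: Map<string, number>;
} {
  const layerOf = new Map<string, number>();
  const layers: string[] = [];
  for (const id of textureIds) {
    if (!layerOf.has(id)) {
      layerOf.set(id, layers.length);
      layers.push(id);
    }
  }
  return { layers, layerOf };
}
```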
Thanks for the insights, that sounds reasonable!
I’ve done a quick cherry-pick to reduce it to only the changes that touch the texture system. Obviously, in this state it wouldn’t really work, but it might give some perspective on our approach.
Our use case is a little different since we can’t really rely on the UVs being there in the first place. Also, we are not dealing with any textures stored in the data; instead, we pull textures from a separate database based on the render materials and render material proxies.
Our biggest headache is currently the batch system, since we also want to assign new materials to parts on the fly. Our approach works for now, but I feel like there are a few sleeper bugs in there, especially since the batcher previously used the colour-based hashes to merge identical materials, which is more complicated with textured materials, and I didn’t fully understand when and where they are relevant.
Thank you for sharing! The approach you took is similar to what we currently have going, with a few exceptions. For the moment, in order to keep things simple, we compute material hashes based on texture presence, which means that batches will be split based on material textures alongside the rest of the existing properties.
If you are asking where material hashes are relevant: they determine how objects are initially split into batches. Objects with identical materials will generally be batched together, except for special cases involving some instancing conditions and other limits we impose. But yeah, adding textures into this mix isn’t straightforward. Our choice so far simplifies working with the batch system, but it’s not optimal performance-wise.
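To make the texture-presence hashing concrete, here is an illustrative sketch: identical hashes end up in the same batch, so the texture reference has to be part of the key alongside the existing colour-based properties. The field names are assumptions for the example, not Speckle’s actual RenderMaterial shape:

```typescript
interface MaterialLike {
  color: number; // e.g. 0xff8800
  opacity: number;
  baseColorTextureId?: string; // undefined when untextured
}

// Untextured materials with equal colour/opacity still merge into one
// batch; textured materials only merge when they reference the same
// texture, which is what splits batches "based on texture presence".
function materialHash(m: MaterialLike): string {
  return [m.color, m.opacity, m.baseColorTextureId ?? "none"].join("|");
}
```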
The batching system currently ignores the possibility of adding textures, but we are currently looking at changing that.
I’m open to discussing this (and any other topic) further, so feel free to post here whenever you feel the need to.