Presumably, these mirror the same numbers in Objects.Geometry.Pointcloud.
I was keen to learn whether this follows best practice, a lesson learned, or just guesswork; the x, 2x, 2x pattern seems significant, yet 31250 isn’t divisible by 3 or 9, so the chunk boundaries don’t align with xyz triplets and it can’t be about validity…
As a supplementary question: what is the ‘sizes’ property? It wasn’t used in the hackathon demo.
Postgres and LAS both have a fourth dimension for Intensity; is ‘sizes’ intended for that, or is it the size of each point as rendered in the viewer?
Chunking large lists is used to ensure any single object doesn’t get too big, mostly for network transport performance reasons.
The exact number of items in a chunk doesn’t matter hugely; I think those exact numbers were somewhat arbitrarily chosen.
Each chunk has some overhead, so ideally you want quite big chunks.
But you also want most objects to be well under the hard 10Mb limit of a single object. Otherwise you end up with HTTP requests that are huge and potentially unstable.
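To make the mechanism concrete, here is a minimal sketch of the idea (illustrative only; the names are hypothetical, not Speckle.Core’s actual serializer internals): a long flat list is split into fixed-size chunks, and each chunk is saved and transported as its own object.

```csharp
using System.Collections.Generic;
using System.Linq;

static class Chunker
{
    // Split a flat value list into fixed-size chunks; each chunk then
    // travels as its own object, so nothing approaches the 10Mb limit.
    public static IEnumerable<List<double>> Chunk(IReadOnlyList<double> values, int chunkSize)
    {
        for (var i = 0; i < values.Count; i += chunkSize)
            yield return values.Skip(i).Take(chunkSize).ToList();
    }
}
```

For example, a 1,000,000-double `points` list chunked at 31250 yields 32 chunks.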
The significance of those exact numbers is:
62500 × 32 bit = 2,000,000 bits = 2 Mb (signed integer = 32 bit)
31250 × 64 bit = 2,000,000 bits = 2 Mb (double-precision float = 64 bit)
In other words, each full chunk carries the same ~250 kB (2 Mb) of raw numeric data.
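Put differently, the chunk lengths fall out of a ~2 Mb raw-data budget per chunk divided by the element width (a hypothetical helper, not part of Speckle.Core):

```csharp
// Hypothetical: derive a chunk length from a 2,000,000-bit (2 Mb) budget.
static int ChunkLength(int bitsPerElement, long budgetBits = 2_000_000) =>
    (int)(budgetBits / bitsPerElement);

System.Console.WriteLine(ChunkLength(32)); // 62500 -> 32-bit signed ints
System.Console.WriteLine(ChunkLength(64)); // 31250 -> 64-bit doubles
```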
I would suggest keeping to numbers roughly around this 2Mb point, but you shouldn’t need to worry too much about exact chunk sizes unless you’re trying to optimise performance for really large pointclouds.
You also don’t need to worry about exactly matching the numbers used in our .NET objects, since that number is only used when serializing (sending) the objects.
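For reference, this is roughly where those numbers live on the .NET side; it is paraphrased from memory of Objects.Geometry.Pointcloud in speckle-sharp and trimmed, so check the source for the authoritative definition:

```csharp
using System.Collections.Generic;
using Speckle.Core.Models; // Base, DetachProperty, Chunkable

public class Pointcloud : Base
{
    [DetachProperty, Chunkable(31250)] // flattened x, y, z triplets
    public List<double> points { get; set; } = new List<double>();

    [DetachProperty, Chunkable(62500)] // one packed colour int per point
    public List<int> colors { get; set; } = new List<int>();

    [DetachProperty, Chunkable(62500)] // one size value per point
    public List<double> sizes { get; set; } = new List<double>();

    // bbox, units etc. omitted
}
```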
I’m not 100% sure if this property is used/supported by all of our connectors/the viewer. @clrkng might know more.