Failing to send a layer from QGIS (large GeoJSON with 120k+ entities)

Hello,

I’m trying to send objects (point entities) from a GeoJSON loaded in QGIS.

I’m getting a 400 error (screenshot attached).

The GeoJSON is available here: Signalisation permanente — SNCF Open Data

https://ressources.data.sncf.com/api/explore/v2.1/catalog/datasets/signalisation-permanente/exports/geojson?lang=fr&timezone=Europe%2FParis

Any clue what could go wrong here?

Hi @pauldotnet - were you sending to speckle.xyz?
If so, could you please share the URL of the stream you were sending to?

Hey @iainsproat, thanks for the fast answer
Here it is: Speckle
There is a line network in there, but the data I’ve attempted to send is a list of points, which didn’t commit.

Thanks for sharing this - we’ve been able to identify the problem: the connector was sending data to the server in batches, and one of those batches exceeded the maximum size limit imposed by the server.

I’ll coordinate with @Kateryna on how best to resolve this for you.


Thank you!
Actually, I’ve also tried to send the same data from the Excel connector, and it also failed to send everything - only a small portion of it worked.
Branch | Speckle


Hi @pauldotnet - as an update, we’ve made a fix to the server which we’re currently testing. We’ll try to roll this out in the next few days or weeks.


Hey @iainsproat ,

I’ve given this experiment from a few months ago another shot, and I’m still struggling to send this large amount of data.

Here is the log:

(screenshot: upload error log)

Can this limit be avoided?

Thanks
Paul

Hi @pauldotnet

If you are using https://speckle.xyz, I will have to investigate the implications of increasing this size, which may take a little time. In the meantime, the workaround would be to divide the object into smaller parts.

Alternatively, if you are running your own server, the maximum size is configurable via the MAX_OBJECT_SIZE_MB environment variable, which can be added to the server container in the Docker compose file. If you are instead deploying the server using the Kubernetes Helm chart, the equivalent is the server.max_object_size_mb value.
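
For example, in a compose file this could look something like the following sketch (the service name speckle-server and the 100 MB value are assumptions - adjust them to your own deployment):

```yaml
services:
  speckle-server:
    environment:
      # Size limit per object, in megabytes. 100 is just an example value.
      MAX_OBJECT_SIZE_MB: "100"
```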

Kind regards,

Iain


Hi Paul! What you can try is removing all layer attributes that you don’t need - they might be taking up space unnecessarily.
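
A minimal sketch of doing this from the QGIS Python console, assuming the layer is loaded under the (hypothetical) name signalisation-permanente, that type is a hypothetical field you want to keep, and that the provider supports deleting attributes:

```python
from qgis.core import QgsProject

# Hypothetical layer name - replace with the name of your loaded GeoJSON layer.
layer = QgsProject.instance().mapLayersByName("signalisation-permanente")[0]

keep = {"type"}  # hypothetical set of field names to keep
drop = [i for i, field in enumerate(layer.fields()) if field.name() not in keep]

# Delete the unwanted attributes and refresh the layer's field list.
layer.dataProvider().deleteAttributes(drop)
layer.updateFields()
```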

Hi @Kateryna! thanks for the tip.
I’ve removed all attributes except one on all features, and the result is pretty much the same.

Before removing:

(screenshot: object size before removing attributes)

After removing:

(screenshot: object size after removing attributes)

:cold_sweat:
This looks like one of the features might be causing the issue, as the features of each layer are uploaded as individual objects and are not aggregated against the size limit. Following your test, removing attributes has reduced the size slightly, so could you also try:

  1. Removing all attributes, so we know for sure whether this remaining attribute is the problem? One of my guesses is that it contains a massive paragraph of text.

  2. Is the layer of type Point or MultiPoint? The features of each layer are uploaded separately, but multiple points within one feature are not. This is my other guess: one feature with thousands of points being treated as one object. If this is the case, it should be an easy fix. (See the sketch below this list.)
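
A minimal sketch for checking both guesses from the QGIS Python console (the layer name is an assumption):

```python
from qgis.core import QgsProject, QgsWkbTypes

# Hypothetical layer name - replace with the name of your loaded GeoJSON layer.
layer = QgsProject.instance().mapLayersByName("signalisation-permanente")[0]

# Prints the geometry type, e.g. "Point" or "MultiPoint".
print(QgsWkbTypes.displayString(layer.wkbType()))

# Largest coordinate count in any single feature; a huge number here would
# mean one feature is aggregating thousands of points.
print(max(f.geometry().constGet().nCoordinates() for f in layer.getFeatures()))
```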

Let me know👌


Oops, I realized that you attached the dataset in your first message. Neither of the above assumptions is true, so let’s think about what else the issue could be :melting_face:


Ah I was just about to answer :slight_smile:
Yeah, I tried removing the last attribute; the size decreased a bit again, but still not enough:

(screenshot: object size still above the limit)

Double-checked the GeoJSON: there are only individual points, no multipoints…


Good news: issue found
Bad news: unfortunately, there might be no immediate solution to this :pensive: It’s on the list, but it’s a bigger question about Object structure than just resolving QGIS features.

My best suggestion for now would be to split the layer in half and send it as two.
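
For what it’s worth, here is a minimal sketch of splitting the layer into two in-memory halves from the QGIS Python console (the layer name is an assumption, and the geometry is assumed to be simple Point):

```python
from qgis.core import QgsProject, QgsVectorLayer

# Hypothetical layer name - replace with the name of your loaded GeoJSON layer.
src = QgsProject.instance().mapLayersByName("signalisation-permanente")[0]

feats = list(src.getFeatures())
mid = len(feats) // 2

for name, chunk in (("part_1", feats[:mid]), ("part_2", feats[mid:])):
    # In-memory point layer with the same CRS and fields as the source.
    part = QgsVectorLayer(f"Point?crs={src.crs().authid()}", name, "memory")
    part.dataProvider().addAttributes(list(src.fields()))
    part.updateFields()
    part.dataProvider().addFeatures(chunk)
    QgsProject.instance().addMapLayer(part)
```

Each half can then be sent as its own commit.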


Ok, thank you for investigating this!
