RE: 🤖 Speckle Actions! Suggestions?

Hey Jonathon! I’m trying to set up an AWS Lambda function to retrieve some data from a Speckle stream.

I’m getting an error:

 "errorType": "IOException",
  "errorMessage": "Read-only file system : '/var/task/Speckle'",
  "stackTrace": [
    "at System.IO.FileSystem.CreateDirectory(String fullPath)",
    "at System.IO.Directory.CreateDirectory(String path)",
    "at Speckle.Core.Transports.SQLiteTransport..ctor(String basePath, String applicationName, String scope)",
    "at Speckle.Core.Api.Operations.Receive(String objectId, CancellationToken cancellationToken, ITransport remoteTransport, ITransport localTransport, Action`1 onProgressAction, Action`2 onErrorAction, Action`1 onTotalChildrenCountKnown, Boolean disposeTransports, SerializerVersion serializerVersion)",
    "at Speckle2Neo.Function.GetLatestCommitOnStream(Account account, String streamId, String branchName)",
    "at Speckle2Neo.Function.FunctionHandler(String streamId, ILambdaContext context)",
    "at lambda_method1(Closure , Stream , ILambdaContext , Stream )",
    "at Amazon.Lambda.RuntimeSupport.Bootstrap.UserCodeLoader.Invoke(Stream lambdaData, ILambdaContext lambdaContext, Stream outStream) in /src/Repo/Libraries/src/Amazon.Lambda.RuntimeSupport/Bootstrap/UserCodeLoader.cs:line 145",
    "at Amazon.Lambda.RuntimeSupport.HandlerWrapper.<>c__DisplayClass8_0.<GetHandlerWrapper>b__0(InvocationRequest invocation) in /src/Repo/Libraries/src/Amazon.Lambda.RuntimeSupport/Bootstrap/HandlerWrapper.cs:line 56",
    "at Amazon.Lambda.RuntimeSupport.LambdaBootstrap.InvokeOnceAsync(CancellationToken cancellationToken) in /src/Repo/Libraries/src/Amazon.Lambda.RuntimeSupport/Bootstrap/LambdaBootstrap.cs:line 176"
  ]
}

I’m assuming this is because only the /tmp directory is writable in AWS Lambda? What is Speckle trying to write to disk when receiving data?

What’s the recommended approach (if there is one) for accessing Speckle data from serverless .NET functions: Speckle.Core, REST, or GraphQL?

Any help would be appreciated.

using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Speckle.Core.Api;
using Speckle.Core.Credentials;
using Speckle.Core.Transports;

public class Function
{
    /// <summary>
    /// Receives the latest commit on the given stream's main branch and returns it as JSON.
    /// </summary>
    /// <param name="streamId">The id of the Speckle stream to read from.</param>
    /// <param name="context">The Lambda execution context.</param>
    /// <returns>A message containing the serialized commit data.</returns>
    public async Task<string> FunctionHandler(string streamId, ILambdaContext context)
    {
        var account = new Account
        {
            token = "xxx",
            serverInfo = new ServerInfo
            {
                url = "https://speckle.xyz/"
            }
        };

        var data = await GetLatestCommitOnStream(account, streamId, "main");

        var result = $"Latest commit on branch {streamId}/main: {data}";

        return result;
    }

    private async Task<string> GetLatestCommitOnStream(Account account, string streamId, string branchName)
    {
        var client = new Client(account);
        var branch = await client.BranchGet(streamId, branchName, 1);
        var objectId = branch.commits.items[0].referencedObject; // take the latest commit

        var transport = new ServerTransport(account, streamId);

        var data = await Operations.Receive(
          objectId,
          remoteTransport: transport,
          disposeTransports: true
        );

        var jsonData = System.Text.Json.JsonSerializer.Serialize(data);

        return jsonData;
    }
}

It is the SQLite cache (Speckle’s local object cache) that is failing to write to disk.

I have two options for you to try:

  1. Use in-memory caching for the lifetime of the Lambda instance.
var inMemoryTransport = new MemoryTransport();

var data = await Operations.Receive(
  objectId,
  remoteTransport: transport,
  localTransport: inMemoryTransport // will use this one instead of the global cache!
);

This should be fine except for very large objects or commits.
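If you want that in-memory cache to actually last for the lifetime of the warm Lambda instance (rather than being rebuilt on every invocation), one rough sketch is to hold the transport in a static field; the field name here is just illustrative, not part of the Speckle API:

public class Function
{
    // Survives for as long as AWS keeps this Lambda instance warm, so repeated
    // invocations can reuse already-received objects instead of re-downloading them.
    private static readonly MemoryTransport InstanceCache = new MemoryTransport();

    // ... then, when receiving:
    // var data = await Operations.Receive(
    //   objectId,
    //   remoteTransport: transport,
    //   localTransport: InstanceCache); // don't pass disposeTransports, so the cache is kept
}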

  2. Use /tmp as ephemeral storage for the SQLite cache.

var ephemeralSqlTransport = new SQLiteTransport(basePath: @"/tmp");

var data = await Operations.Receive(
  objectId,
  remoteTransport: transport,
  localTransport: ephemeralSqlTransport);

(you’ll need to check the specifics of the path)
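For reference, the stack trace above shows the constructor as SQLiteTransport(basePath, applicationName, scope), so you can pin down exactly where the cache lands; the applicationName and scope values below are illustrative, only basePath matters for the read-only filesystem problem:

var ephemeralSqlTransport = new SQLiteTransport(
  basePath: "/tmp",            // Lambda's only writable path
  applicationName: "Speckle",  // cache directory becomes /tmp/Speckle (cf. '/var/task/Speckle' in the error)
  scope: "Objects");           // illustrative scope name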


Depending on the AWS flavour of Lambdas, you might be limited to 512 MB here. There are options to use other ephemeral storage, but AWS changes so quickly I’m not sure of the current limits. Try one approach until something breaks, then try the other. If the Lambda’s memory can be flexed, I’d prefer to do that.
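If you go the /tmp route, a rough way to keep an eye on how much of that quota the cache is using is to sum the file sizes under it; this helper is purely illustrative and not part of Speckle or the Lambda SDK:

using System.IO;
using System.Linq;

static class TmpUsage
{
    // Returns the total size, in bytes, of everything currently stored under /tmp.
    public static long BytesUsed(string path = "/tmp")
    {
        if (!Directory.Exists(path)) return 0;

        return new DirectoryInfo(path)
            .EnumerateFiles("*", SearchOption.AllDirectories)
            .Sum(f => f.Length);
    }
}

// e.g. context.Logger.LogLine($"/tmp usage: {TmpUsage.BytesUsed() / (1024 * 1024)} MB");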

Additional memory or different storage will likely need to be paid for…

Much appreciated, @jonathon. Will try your suggestions and report back.

MemoryTransport works with the default 512 MB of memory on Lambda.

Thank you!

You’re welcome, @d3ssy, and welcome to the Speckle Community. This was a great first question, so it was worth making public.

I was curious whether it was SQLiteTransport or MemoryTransport you used in the end; the text said one thing and the image the other… but mostly I’m happy it worked.

Feel free to introduce yourself :wave: if you want. Hope to see what you might be building/automating with Speckle :speckle:

I used MemoryTransport in the end with the 512 MB allocation.

var inMemoryTransport = new MemoryTransport();

var data = await Operations.Receive(
  objectId,
  remoteTransport: transport,
  localTransport: inMemoryTransport // will use this one instead of the global cache!
);

Yeah @d3ssy don’t be shy, you can tell us what you’re building :smiley:

I finally got something going that’s worth sharing! I’m building a knowledge graph to capture end-to-end project lifecycle data, from design to operations, ingesting data from various sources and domains along the way. In this context, Speckle is a data source representing any data associated with geometry (e.g. architectural design, structural skeleton, spatial massings).

So the first thing I wanted to do was build an automated data pipeline from Speckle to Neo4j that lets me ingest commits on a specific stream and update the graph.

I’m running this in the Lambda function above, triggered via a webhook on a Speckle stream, but I will probably move to a custom Speckle server. Thoughts?
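In case it helps anyone, the ingest step is roughly shaped like this (a simplified sketch using the official Neo4j .NET driver, the Neo4j.Driver NuGet package; the graph model, Cypher, and class names here are illustrative only):

using System.Threading.Tasks;
using Neo4j.Driver;

public class CommitIngestor
{
    private readonly IDriver _driver;

    public CommitIngestor(string uri, string user, string password)
    {
        // Official Neo4j .NET driver (Neo4j.Driver NuGet package).
        _driver = GraphDatabase.Driver(uri, AuthTokens.Basic(user, password));
    }

    // Merges the stream and commit into the graph and links them together.
    public async Task IngestCommitAsync(string streamId, string commitId, string json)
    {
        await using var session = _driver.AsyncSession();
        await session.RunAsync(
            "MERGE (s:Stream {id: $streamId}) " +
            "MERGE (c:Commit {id: $commitId}) " +
            "SET c.payload = $json " +
            "MERGE (s)-[:HAS_COMMIT]->(c)",
            new { streamId, commitId, json });
    }
}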

Speckle2Neo connector anyone? :slight_smile:

Looks fun! I remember playing with Neo4j as well a bunch of years ago :slight_smile:
