Unable to load a large number of runs as one job

Hi,
For some reason I’m unable to load a large number of runs as one job.
When trying to upload 2187 runs, the upload fails after a couple of minutes with the error “Error calling CreateArtifact: The request timed-out.”

The same runs upload successfully when divided into two parts; however, my final goal requires many more simulations, which makes splitting impractical.

What is the cause of this problem, and is there a faster, more convenient way to upload runs to the server? Thanks!

Hi @artmouser, sorry for the delayed response.

The Grasshopper plugin tries to do a few tasks together, and that can become a bottleneck in cases like yours. Instead, in a standard workflow, you can upload the files to Pollination first and then reference them in the runs.

Our initial hope was that users would customize the recipes. For example, you would upload all your geometry scenarios first and then use a recipe that looks up the folder and starts the runs. Because the files would already be on Pollination, each run could pick them up without hitting issues like the one you are facing here.
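For what it's worth, a minimal sketch of that upload-first pattern could look like the code below. I'm assuming a project-level artifact endpoint that returns a signed upload form; the base URL, auth header, payload, and response shape here are illustrative, so check the Pollination API docs for the real contract.

```python
import pathlib
import requests

API = "https://api.pollination.cloud"              # assumed base URL
HEADERS = {"x-pollination-token": "YOUR_API_KEY"}  # assumed auth header

def upload_artifact(owner: str, project: str, path: pathlib.Path) -> str:
    """Upload one file to the project folder and return its project-relative key.

    NOTE: the endpoint and response shape below are assumptions for
    illustration; consult the API docs for the real contract.
    """
    key = path.name
    # Ask the server for a signed upload form for this key (assumed endpoint).
    resp = requests.post(
        f"{API}/projects/{owner}/{project}/artifacts",
        json={"key": key},
        headers=HEADERS,
        timeout=60,
    )
    resp.raise_for_status()
    signed = resp.json()
    # Post the file itself to the signed URL returned by the server.
    with path.open("rb") as fp:
        upload = requests.post(
            signed["url"], data=signed["fields"], files={"file": fp}, timeout=300
        )
    upload.raise_for_status()
    return key

# Upload every scenario file once; the runs later reference these keys
# instead of re-sending the geometry with every run.
keys = [
    upload_artifact("my-account", "my-project", p)
    for p in pathlib.Path("scenarios").glob("*.hbjson")
]
```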

@mingbo, is the time-out limit something that we can increase to avoid scenarios like this?

Hi,
I have encountered the same error when running more than 50 simulations in my workflow. Is it possible to increase that timeout limit?

@mingbo and I have talked about this a few times. I don’t think increasing the timeout will solve the problem, as there will always be larger studies with more files.

I think we should either change the process to upload the files one by one and update the runs to use those files, similar to how we do it in sample workflows like this one (How to submit an energy simulation to Pollination using the API - #9 by mostapha), or provide a way to upload the files to your project separately first and then use references to those files to set up the runs quickly.
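As a rough sketch of that first option, assuming an `upload_artifact` helper like the one sketched earlier in this thread: upload sequentially with retries so one slow file cannot time out the whole study, and let each run carry only a reference to its uploaded file.

```python
import pathlib
import time

import requests

OWNER, PROJECT = "my-account", "my-project"  # illustrative names
paths = sorted(pathlib.Path("scenarios").glob("*.hbjson"))

def upload_with_retry(path: pathlib.Path, attempts: int = 3) -> str:
    """Upload one file, retrying on transient failures so a single
    timeout does not abort the whole batch."""
    for attempt in range(1, attempts + 1):
        try:
            # upload_artifact is the illustrative helper from the sketch above.
            return upload_artifact(OWNER, PROJECT, path)
        except requests.RequestException:
            if attempt == attempts:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff

# Each run later carries only a *reference* to its uploaded file (the
# project-relative key), so the job payload itself stays small.
keys = [upload_with_retry(p) for p in paths]
```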

2 posts were split to a new topic: How to add a file or a folder as a recipe input

Hello, I am also struggling to upload 400 simulation runs. I'm curious whether there is any solution other than splitting the samples.

Hi @prateekwahi,

You can use a Python script similar to this one to submit large studies.
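In case it helps before you dig into the script, the general shape of such a submission script is sketched below: one job with many runs, where each inner list of arguments defines one run that references an already-uploaded file. The endpoint path, job schema, and response shape are my assumptions here; treat the linked script as the authoritative version.

```python
import requests

API = "https://api.pollination.cloud"              # assumed base URL
HEADERS = {"x-pollination-token": "YOUR_API_KEY"}  # assumed auth header
OWNER, PROJECT = "my-account", "my-project"

keys = ["scenario_001.hbjson", "scenario_002.hbjson"]  # uploaded earlier

# One job, many runs: each inner list of arguments defines one run.
# The argument name ("model" here) depends on the recipe you are using.
job = {
    "type": "Job",
    "name": "large-study",
    "source": "<recipe-source-url>",  # registry URL of the recipe you use
    "arguments": [
        [
            {
                "type": "JobPathArgument",
                "name": "model",
                "source": {"type": "ProjectFolder", "path": key},
            }
        ]
        for key in keys
    ],
}

resp = requests.post(
    f"{API}/projects/{OWNER}/{PROJECT}/jobs",
    json=job,
    headers=HEADERS,
    timeout=60,
)
resp.raise_for_status()
print("submitted job:", resp.json().get("id"))  # assumed response shape
```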

I also know that @mingbo has started breaking down the steps for uploading the input files and setting up the runs. Once that is ready, you should be able to set up larger runs from Grasshopper too.

Oh, great! I will check this out.
I also wonder if this script would help download or read the results faster compared to the current workflow of using Ladybug fly or the Colibri iterator to read through all the jobs.

Yes, it does, but you might need to modify the script based on the recipe and the output(s) that you are interested in.
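The usual speed-up on the download side is to fetch the run outputs concurrently instead of iterating job by job. A minimal sketch, assuming a per-run outputs endpoint; the path, the output name, and the `download_output` helper are placeholders to adapt to your recipe.

```python
from concurrent.futures import ThreadPoolExecutor

import requests

API = "https://api.pollination.cloud"              # assumed base URL
HEADERS = {"x-pollination-token": "YOUR_API_KEY"}  # assumed auth header
OWNER, PROJECT = "my-account", "my-project"

def download_output(run_id: str, output_name: str) -> bytes:
    """Fetch one output of one run. The endpoint path is an assumption;
    adapt it (and output_name) to the recipe you are running."""
    resp = requests.get(
        f"{API}/projects/{OWNER}/{PROJECT}/runs/{run_id}/outputs/{output_name}",
        headers=HEADERS,
        timeout=300,
    )
    resp.raise_for_status()
    return resp.content

run_ids: list = []  # fill from the job's run list once the study finishes

# Fetch all runs' results concurrently rather than one run at a time,
# which is where most of the waiting usually happens.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(lambda r: download_output(r, "results"), run_ids))
```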


A post was split to a new topic: The best way to generate hundreds of HBJSON files to use with Pollination

A post was split to a new topic: Error when downloading the results of a large study