To summarize, there are four major behavioral changes that need to be made:
Headless Authentication: Login should be possible using something like an API key + secret passed as parameters to a component.
Computation Pauses: Downstream computation should wait when running on a cloud GH instance. Rather than scheduling a recomputation once the run is complete or assets have finished loading, the components’ SolveInstance method should wait for these steps to complete (synchronously rather than asynchronously) before outputting any data. The end result should be no more null computations with empty outputs when running on the cloud.
Local Filesystem Management: While most cloud systems shouldn’t mind temporary file creation, they will probably mind if those files persist after their usefulness has expired. Before any temporary files are created, the temporary folder in which they will be created should be purged. Furthermore, the end user should not be able to define the location of the temporary folder.
Minor UI-related adjustments: Some UI-only behavior needs to be disabled when running on the cloud, such as dynamic output parameters on components, “run” Boolean parameters, UI buttons on components, etc.
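The “wait instead of reschedule” behavior described for Computation Pauses can be sketched as a simple blocking poll. This is a minimal illustration, not the plugin’s actual implementation: `get_status` is a hypothetical callable standing in for whatever the real cloud API exposes.

```python
import time

def wait_for_run(get_status, timeout_s=600.0, poll_s=2.0):
    """Block until a cloud run reports completion, then return its status.

    `get_status` is a hypothetical stand-in for the real cloud API;
    it is assumed to return e.g. "running", "complete", or "failed".
    Blocking here means SolveInstance would not finish (and output
    nulls) before the results actually exist.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("complete", "failed"):
            return status
        time.sleep(poll_s)
    raise TimeoutError("cloud run did not finish in time")
```

The trade-off is that the whole definition stalls while waiting, which is exactly what a single-shot cloud solve wants, but would feel unresponsive in an interactive local session.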
These changes can be made in a way that leaves local use of PCGH unaffected - a simple file present somewhere accessible should be enough to tell the plugin that it is running on a cloud system and should therefore behave differently.
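The marker-file idea above can be sketched in a few lines. The file name and location here are assumptions for illustration; any path agreed between the plugin and the platform would work.

```python
import os
import tempfile

# Hypothetical marker-file path; the real name/location would be
# whatever the plugin and the cloud platform agree on.
CLOUD_MARKER = os.path.join(tempfile.gettempdir(), "pcgh_cloud_mode")

def running_on_cloud() -> bool:
    """Return True when the cloud-mode marker file is present.

    A cloud platform drops this file once per machine image;
    local installs never have it, so local behavior is unchanged.
    """
    return os.path.isfile(CLOUD_MARKER)
```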
Below is a diagram that might help understand the behavioral changes a little better:
Thanks for your detailed explanation, and nice diagrams.
Yes, we are aware of these issues on cloud-based Grasshopper platforms, and we are also working with some of these platforms on solutions.
Headless Authentication: yes, we will provide “login by token” option.
Computation Pauses: we already have an option for turning “non-blocking” on/off; please see the updates from here.
Local Filesystem Management: each job/run has an associated id, which is used for creating folders under the temporary files location. The local temporary folder comes from the Path.GetTempPath Method (System.IO) | Microsoft Docs, which is not overridable by the user, and the operating system will automatically clean up this temporary folder periodically.
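The per-job folder scheme described above can be sketched as follows. This is an illustrative Python analogue, not the plugin’s .NET code: `tempfile.gettempdir()` plays the role of Path.GetTempPath, and the id-naming is an assumption based on the description.

```python
import os
import tempfile
import uuid

def job_temp_folder(job_id=None):
    """Create a per-job subfolder under the system temp path.

    Mirrors the described scheme: Path.GetTempPath in .NET and
    tempfile.gettempdir() here both resolve the OS temp location,
    and each job/run gets its own id-named folder beneath it, so
    concurrent runs never share files.
    """
    job_id = job_id or uuid.uuid4().hex
    folder = os.path.join(tempfile.gettempdir(), job_id)
    os.makedirs(folder, exist_ok=True)
    return folder
```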
Minor UI-related adjustments: we are reviewing some of the limitations with other cloud platform providers. We will keep you posted once we have any updates.
Thanks for the update @Mingbo , this is great!
Glad to know that you’re already making progress in this direction.
I tested the new version and can confirm that the Computation Pauses issue is resolved with the ‘Non-Blocking’ option. However, each blocking component needs to be set to non-blocking mode individually. I fear that this might create the need for a guide or extra documentation on how to use Pollination on the cloud.
Are you planning some kind of ‘cloud mode’ for the plugin, so that the end user does not have to modify their script too much to get it running on the cloud? The plugin could detect, or be informed, that it is about to run on the cloud and automatically change its behavior.
Regarding what you mentioned about Local Filesystem Management, I am unsure whether this is the case: while the mentioned method does return the path to the default temp folder, I am not convinced that this folder is automatically cleaned by Windows periodically, as this is not mentioned anywhere in Microsoft’s documentation. Check: c# - Are files in the temporary folder automatically deleted? - Stack Overflow
Also, using the folder_ parameter, I am still able to override the temp file storage location, as shown below:
I have two more technical concerns, which are ShapeDiver-specific:
Version string override: The Version property in the class inheriting from GH_AssemblyInfo has not been overridden. Our cloud system uses this property to ensure that only supported versions of plugins are allowed to be uploaded. You can also override this property programmatically using information from the latest build version. The end result should be that this version string appears in the .ghx file of any script containing components from the plugin.
Version of Grasshopper assemblies built against: For various reasons, we prefer to ensure compatibility with Rhino 6.12 onwards. McNeel has a good track record of development practices that ensure backwards compatibility, so rolling back to 6.12 rather than 6.23 should not affect the plugin. Check out this issue on Bowerbird: RhinoCommon and Grasshopper Assembly versions · Issue #76 · oberbichler/Bowerbird · GitHub. As an added benefit, this will give the plugin wider compatibility.
Another point of concern I overlooked:
EPW files are loaded in using the ‘Download EPW’ component in Ladybug. Are you working on an alternative to this? Maybe some kind of ‘Weather’ Param type? That way the EPW data could easily be internalized, making it simple to upload the script for running on the cloud.
UPDATE regarding Local Filesystem Management: It’s possible to allow the current method of temporary file management, as we can manually clear the temp folder periodically using a simple script. However, we would need a better understanding of the nature of the stored files: their expected size, how long they must persist, and how they behave when multiple instances of GH are running.
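A periodic cleanup script of the kind mentioned above could look like this. The age threshold is an assumption; the right value depends on exactly the open questions listed (file sizes, required persistence, and concurrent GH instances sharing the folder).

```python
import os
import shutil
import time

def purge_old(folder: str, max_age_hours: float = 24.0) -> int:
    """Delete entries in `folder` older than `max_age_hours`.

    Meant to run on a schedule (cron / Task Scheduler). The 24-hour
    default is a placeholder: too short and a long-running job loses
    its files, too long and the disk fills up.
    """
    cutoff = time.time() - max_age_hours * 3600
    removed = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.getmtime(path) < cutoff:
            if os.path.isdir(path):
                shutil.rmtree(path, ignore_errors=True)
            else:
                os.remove(path)
            removed += 1
    return removed
```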
Hi @pmArchitect, thank you for all of your comments. Here are some thoughts in addition to what @Mingbo has already shared.
Do you support secret inputs on ShapeDiver, something like a password input where the user’s input is not visible? I feel it is safer to have the API key as an input to the script, where each user enters their own API key, instead of having one saved inside the GH script, which could easily expose the key.
I just wanted to make sure that you already know about the option for the Pollinate component. You can right click on the component and change the button input to be a normal input.
For dynamic outputs: since the recipe outputs for a specific version never change, the components will behave like static components after the first time someone generates/expands them. They should work without any UI interaction after that.
Would loading the files from a local machine be an option too? In theory, you can add a custom Python component to do that for you and read/create the EPW object from a string, but it will be tricky and will limit the usage of the script - with an internalized weather file, the script will only work for that location, which is not usually the desired behavior for a script.
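The string round-trip mentioned above can be sketched in plain Python. This deliberately avoids the Ladybug API and just shows the mechanics: read an .epw file into a string (so it can be internalized, e.g. in a panel), then write it back to a temp file for components that expect a file path. Both function names are hypothetical.

```python
import os
import tempfile

def internalize_epw(path: str) -> str:
    """Read a local .epw file into a string so it can be internalized."""
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        return f.read()

def materialize_epw(epw_text: str) -> str:
    """Write internalized EPW text back to a temp file and return its
    path, for components that expect a file path rather than a string."""
    fd, path = tempfile.mkstemp(suffix=".epw")
    with os.fdopen(fd, "w", encoding="utf-8") as f:
        f.write(epw_text)
    return path
```

As noted, this bakes a single location into the script, which is usually the wrong trade-off for a reusable definition.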
Hi @pmArchitect, @Mingbo and I had a call today and we have some ideas for how to get this to work. We are a bit concerned about exposing the APIKey input. It’s simple, but it can introduce a security issue where users forget to remove the APIKey and things can get really out of control.
I will send a follow up email to schedule for a call but I wanted to post this note here for anybody else who is following this thread on Discourse.