Pollination ecosystem in Jupyter (a happy tale and some code!)

Edit: came back to fix a bug in the snippet.
Long story short:
Nothing gets me STOKED like some radness from all things LBT-related.
First few days working heavily with the PO API/SDK and Streamlit: THRILLED.
I needed to get some data from the overall job dataframe (LBT pandas? YES.)

About 6000 runs across 4 jobs. I needed to pull out the user inputs since I'm building some reporting PCP (parallel coordinate) plots for a parametric study. Some borrowing and 'abuse' of a few pollination-streamlit modules later, this is what has me so stoked and what I wanted to share:

import os

from pollination_streamlit.interactors import Job  # import path may vary by pollination-streamlit version


def download_inputs(job: Job, results_folder: str):
    """Save each run's user inputs to a CSV inside its own run folder."""
    # file references and metadata columns that aren't user inputs
    todrop = [
        'additional-string', 'ddy', 'epw', 'model', 'html', 'idf',
        'job-id', 'run-status', 'sim-par', 'sql', 'zsz', 'err']

    df = job.runs_dataframe.dataframe
    df = df.drop(todrop, axis=1)
    df.rename(columns={'run-id': 'run_id'}, inplace=True)

    for run in job.runs:
        # e.g. data/applicable_job_folder/<run-id>
        output_folder = os.path.join(results_folder, run.id)
        os.makedirs(output_folder, exist_ok=True)
        item = df.loc[df.index == run.id]
        item.to_csv(os.path.join(output_folder, 'user_input.csv'))


# job_selector comes from pollination-streamlit (import path may vary by version)
from pollination_streamlit.selectors import job_selector

def_url = 'full url to PO job'

api_key = None  # or your Pollination API key as a string
job = job_selector(api_key=api_key, default=def_url)

res_fldr = os.path.join('data', 'local_job_dir_name')

download_inputs(job, res_fldr)
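
In case it helps anyone doing something similar, here is a minimal, hypothetical sketch of where those per-run user_input.csv files could go next for the PCP plots mentioned above. It assumes plotly is installed and that the remaining columns are the numeric inputs of the parametric study; none of this is from the original snippet.

import glob
import os

import pandas as pd
import plotly.express as px

res_fldr = os.path.join('data', 'local_job_dir_name')

# stitch the per-run user_input.csv files back into one dataframe
frames = [
    pd.read_csv(csv_path)
    for csv_path in glob.glob(os.path.join(res_fldr, '*', 'user_input.csv'))
]
inputs_df = pd.concat(frames, ignore_index=True)

# parallel coordinates plot of the numeric user inputs
numeric_cols = inputs_df.select_dtypes('number').columns
fig = px.parallel_coordinates(inputs_df, dimensions=numeric_cols)
fig.show()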

I REALLY like pandas and use it sometimes daily; these capabilities just have me super stoked!
Thanks, LBT crew! Thanks for lbt pandas, @antoinedao!!! Super awesome.


Ha! Glad good old ladybug-pandas is coming in handy :grinning_face_with_smiling_eyes: Feels like this post is still relevant then:


LOL I forgot about that :rofl: :rofl: :rofl: :rofl: :rofl: :rofl: