Any automated way to generate Sensor Grid?

Hi there,
I am just starting to look into Pollination Cloud, so excuse me if the question does not really fit here…
I am trying to find a way to use a very roughly meshed model (for example OSM / CityJSON, etc.) with Pollination Cloud without using Rhino / Grasshopper. The problem I see so far is that I need some kind of remeshing, or at least a sensor grid, on which I could then run the simulation.
Does Pollination Cloud provide any tools or features to work with plain meshes, or to remesh them? Or would I have to do the meshing / sensor grid generation myself?

Thank you so much,
p

Hi @pcace,

You can use Ladybug Tools core libraries to generate a sensor grid from the geometries. @antonellodinunzio should be able to provide an example.

There are also other third-party libraries, like meshio, that you can use in your app to generate meshes that can then be translated to sensor grids.
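For instance, here is a rough, untested sketch of that idea, assuming a triangulated surface mesh in a format that meshio can read (the file name and the grid identifier below are just placeholders). The mesh faces are rebuilt as a Ladybug Mesh3D and turned into a sensor grid, with the face centroids used as sensor positions and the face normals as directions:

import meshio
from ladybug_geometry.geometry3d.pointvector import Point3D
from ladybug_geometry.geometry3d.mesh import Mesh3D
from honeybee_radiance.sensorgrid import SensorGrid

# read the rough mesh (placeholder file name; any format meshio supports)
mesh = meshio.read('rough_model.obj')

# collect the triangular faces from all of the cell blocks
faces = []
for cell_block in mesh.cells:
    if cell_block.type == 'triangle':
        faces.extend(tuple(int(i) for i in tri) for tri in cell_block.data)

# rebuild the geometry as a Ladybug Mesh3D
vertices = tuple(Point3D(float(pt[0]), float(pt[1]), float(pt[2])) for pt in mesh.points)
lb_mesh = Mesh3D(vertices, tuple(faces))

# one sensor per mesh face: centroids as positions, normals as directions
grid = SensorGrid.from_mesh3d('rough_model_grid', lb_mesh)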

Hi @pcace,

You can use the core libraries to create a custom script that generates sensor grids. In particular:

  1. ladybug-geometry to customize how the base geometry should be processed
  2. honeybee-core to manage rooms, if your workflow is room-based (see the small sketch below)
  3. honeybee-radiance for the sensor grids themselves

Obviously, it is up to you what the logic should look like and what kind of post-processing you want for point 1.
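As a side note on point 2: if you do not already have Rooms (for example, because you are not going through Rhino), honeybee-core can also create them programmatically. A minimal, untested sketch with an arbitrary identifier and dimensions:

from honeybee.room import Room

# a simple box-shaped Room with an arbitrary name and dimensions (model units);
# the resulting Room can be passed straight to the grid function further below
room = Room.from_box('test_room', width=5.0, depth=10.0, height=3.0)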

For example, here is a function based on honeybee-grasshopper-radiance that uses honeybee-core, honeybee-radiance and ladybug-geometry, and generates the grids from the floors of each room.

from typing import List, Union

from ladybug_geometry.geometry3d.plane import Plane
from ladybug_geometry.geometry3d.face import Face3D
from ladybug_geometry.geometry3d.mesh import Mesh3D

from honeybee.model import Model
from honeybee.room import Room
from honeybee.facetype import Floor, Wall
from honeybee.typing import clean_rad_string

from honeybee_radiance.sensorgrid import SensorGrid


def create_room_grids(rooms: List[Union[Room, Model]],
                      grid_size: float = 1.0,
                      dist_floor: float = 0.8,
                      remove_out: bool = True,
                      wall_offset: float = 0.0):
    """Generate SensorGrids from the floor faces of Honeybee Rooms.

    Args:
        rooms: A list of Honeybee Rooms (or Models whose Rooms will be used).
        grid_size: Dimension of the grid cells in model units.
        dist_floor: Distance of the sensors above the floor faces.
        remove_out: If True, sensors outside of the room volume are removed.
        wall_offset: Distance from the walls within which sensors are removed.

    Returns:
        A list of honeybee-radiance SensorGrids, one per room with floor faces.
    """
    # TODO: add advanced quad only
    x_axis = None

    # create lists to be filled with content
    grid = []
    clean_rooms = []
    for obj in rooms:
        if isinstance(obj, Model):
            clean_rooms.extend(obj.rooms)
        elif isinstance(obj, Room):
            clean_rooms.append(obj)
        else:
            raise TypeError('Expected Honeybee Room or Model. Got {}.'.format(type(obj)))

    for room in clean_rooms:
        # get all of the floor faces of the room as flipped (upward-facing) Face3Ds
        lb_floors = [face.geometry.flip() for face in room.faces if isinstance(face.type, Floor)]

        if len(lb_floors) != 0:
            # create the gridded ladybug Mesh3D
            # if quad_only:  # use Ladybug's built-in meshing methods
            if x_axis:
                lb_floors = [Face3D(f.boundary, Plane(f.normal, f[0], x_axis), f.holes)
                              for f in lb_floors]
            lb_meshes = []

            # TODO: generate a log file for mesh grids errors
            for geo in lb_floors:
                try:
                    lb_meshes.append(geo.mesh_grid(grid_size, offset=dist_floor))
                except AssertionError as e:
                    print(e)
                    continue
            
            if len(lb_meshes) == 0:
                continue  # meshing failed for every floor face of this room
            lb_mesh = lb_meshes[0] if len(lb_meshes) == 1 else Mesh3D.join_meshes(lb_meshes)

            # remove points outside of the room volume if requested
            if remove_out:
                pattern = [room.geometry.is_point_inside(pt)
                           for pt in lb_mesh.face_centroids]
                try:
                    lb_mesh, vertex_pattern = lb_mesh.remove_faces(pattern)
                except AssertionError:  # the grid lies completely outside of the room
                    lb_mesh = None

            # remove any sensors within a certain distance of the walls, if requested
            if wall_offset and lb_mesh is not None:
                wall_geos = [f.geometry for f in room.faces if isinstance(f.type, Wall)]
                pattern = []
                for pt in lb_mesh.face_centroids:
                    for wg in wall_geos:
                        if wg.plane.distance_to_point(pt) <= wall_offset:
                            pattern.append(False)
                            break
                    else:
                        pattern.append(True)
                try:
                    lb_mesh, vertex_pattern = lb_mesh.remove_faces(pattern)
                except AssertionError:  # all of the sensors lie within the wall offset
                    lb_mesh = None

            if lb_mesh is not None:
                # extract positions and directions from the mesh
                base_poss = [(pt.x, pt.y, pt.z) for pt in lb_mesh.face_centroids]
                base_dirs = [(vec.x, vec.y, vec.z) for vec in lb_mesh.face_normals]

                # create the sensor grid
                s_grid = SensorGrid.from_position_and_direction(
                    clean_rad_string(room.display_name), base_poss, base_dirs)
                s_grid.display_name = room.display_name
                s_grid.room_identifier = room.identifier
                s_grid.mesh = lb_mesh
                s_grid.base_geometry = \
                    tuple(f.move(f.normal * dist_floor) for f in lb_floors)

                # append everything to the lists
                grid.append(s_grid)
    
    return grid
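
And, just for completeness, a small untested sketch of how the function could be called outside of Grasshopper, assuming you already have an HBJSON model on disk (the file name is a placeholder) and a recent honeybee / honeybee-radiance installation:

from honeybee.model import Model

# load an existing Honeybee model (placeholder file name)
model = Model.from_hbjson('my_model.hbjson')

# one sensor grid per room, generated from its floor faces
grids = create_room_grids(model.rooms, grid_size=0.5, dist_floor=0.8)

# assign the grids to the model and write it back to disk
model.properties.radiance.add_sensor_grids(grids)
model.to_hbjson('my_model_with_grids', folder='.')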

Best,
Antonello