Building the Future

BUILDSPACE is on a mission to revolutionize urban development by providing cutting-edge services at both building and city scales. Leveraging EGNSS-based applications and Copernicus data, the platform seamlessly integrates external sources and ensures accessibility through standard APIs. BUILDSPACE's core offering comprises five groundbreaking services, including Digital Twin Generation, which employs UAVs, thermography, and SLAM for precise building assessments. At the city level, BUILDSPACE provides Building Environment Climate Scenarios, Urban Heat Analysis, and Urban Flood Resilience.

One of the pivotal services offered by BUILDSPACE is Service 2, designed with the primary goal of enhancing the digital twin. It involves capturing thermal and RGB images of buildings to create a comprehensive model. The service then goes further, employing artificial intelligence to assess the quality of the building's envelope.

In recent months, and in collaboration with the MATEMAD project, BUILDSPACE has focused on the optimization of drone flights and of the image processing used to obtain point clouds.

Generating a 3D model from 2D images (photogrammetry) requires calibrating the cameras to obtain their intrinsic and extrinsic parameters. This calibration is important because the lenses introduce distortion, especially towards the corners of the image, so the captured scene appears distorted. When generating a model from images, this distortion must be taken into account to ensure that the resulting model accurately reflects reality.
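As a rough illustration of this step, the sketch below calibrates a single camera from checkerboard views using OpenCV. The pattern size, folder and file names are placeholders, not the actual BUILDSPACE setup.

```python
import glob

import cv2
import numpy as np

# Checkerboard geometry (inner corners) -- illustrative values, not the
# specification of the BUILDSPACE calibration mesh.
PATTERN = (9, 6)
SQUARE_SIZE = 0.025  # metres

# 3D coordinates of the pattern corners in the target's own reference frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
image_size = None

for path in glob.glob("calibration/rgb/*.jpg"):  # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]  # (width, height)
    found, corners = cv2.findChessboardCorners(gray, PATTERN, None)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics (camera matrix, distortion coefficients) plus one pose per view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)

# The recovered distortion model can then be used to undistort survey images
# before photogrammetric processing.
undistorted = cv2.undistort(cv2.imread("facade.jpg"), K, dist)  # hypothetical image
```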

In the BUILDSPACE project, our drones are equipped with dual cameras capturing infrared and RGB images simultaneously. The two cameras have different positions, parameters and resolutions, so they must be calibrated together. Creating the calibration mesh poses its own challenge: the pattern must be clearly observable in both visible and thermal imaging to ensure accuracy and effectiveness.

Two versions of the mesh were developed: a first version, in which different materials were tested to select those that behaved best under specific conditions, and a second, final mesh, which was used to calibrate the drone's cameras.
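For the dual RGB-thermal rig, one possible way to express the joint step is OpenCV's stereo calibration, which recovers the rotation and translation between the two sensors once each camera's intrinsics are known. The function below is only a minimal sketch under that assumption; the detection of the BUILDSPACE mesh in both spectra is not shown.

```python
import cv2


def register_thermal_to_rgb(obj_points, rgb_points, thermal_points,
                            K_rgb, d_rgb, K_thermal, d_thermal, rgb_size):
    """Estimate the rotation R and translation T that map the thermal camera
    frame onto the RGB camera frame, from matched detections of the same
    calibration target in both sensors (one entry per view). Each camera is
    assumed to have been calibrated individually beforehand, so the
    intrinsics are kept fixed here."""
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)
    rms, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_points, rgb_points, thermal_points,
        K_rgb, d_rgb, K_thermal, d_thermal,
        rgb_size,                      # (width, height) used for the solve
        criteria=criteria,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return rms, R, T
```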

 

Image 1. Mesh Manufacturing Process

 

Image 2. Material Measurement with Handheld Thermal Imaging Camera

 

Image 3. Results of the calibration (RGB and thermal)

 

Calibrating the cameras proved essential for 3D model generation and contributed to significant reductions in both flight and image processing times. Initially, the process involved two separate data processing steps for the thermal and RGB images. However, through calibration, we streamlined it into a single processing step for both image sets. Furthermore, the drone flight planning, which originally required an 85% image overlap, was optimized to just 50% overlap, resulting in fewer photographs per flight and a more efficient drone trajectory.
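To give a feel for what the overlap change means in terms of image count, here is a back-of-the-envelope calculation using assumed facade length and image footprint, not the project's actual flight parameters.

```python
import math


def images_per_strip(strip_length_m, footprint_m, overlap):
    """Approximate number of photographs needed to cover one strip,
    given the footprint of a single image along the flight direction
    and the required forward overlap (0..1). Illustrative only."""
    spacing = footprint_m * (1.0 - overlap)  # distance advanced between shots
    return math.ceil(strip_length_m / spacing) + 1


# Hypothetical example: a 60 m facade covered with a 10 m image footprint.
for overlap in (0.85, 0.50):
    print(f"{overlap:.0%} overlap -> about "
          f"{images_per_strip(60.0, 10.0, overlap)} images per strip")
# 85% overlap -> about 41 images per strip
# 50% overlap -> about 13 images per strip
```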

Diagram 1. Flight Processing Optimization

Special thanks to Universidad Politécnica de Madrid (Service 2) for this update, and to its contributors: 

  • Raquel Burgos Bayo 
  • Luis Javier Sánchez Aparicio 
  • Serafín López Cuervo Medina 
  • Julián Aguirre de Mata 
  • Patricia San Nicolás Vargas 
  • Pablo Sanz Honrado 
  • Rubén Santamaría Maestro 

To stay updated on the latest developments and progress of the #BUILDSPACEproject, visit our website and follow us on our social media. 

LinkedIn: BUILDSPACE project 

Twitter: @BUILDSPACE_EU