Augmented Reality: AR sunlight estimation


Shadows play a vital role in perceiving spatial relations between objects, and they are an essential tool for creating authentic virtual additions to a scene in Augmented Reality (AR) applications. Approaches that match real-world light conditions in AR usually rely on resource-hungry image processing, and often a priori information about the environment or additional physical objects (like mirror spheres) is needed.
As part of my Bachelor’s thesis, I created an Android app exploring the possibilities of sunlight simulation in AR. To aid the usability of consumer applications, it’s best to refrain from manual calibration and additional hardware requirements. Keeping this in mind, I’d like to present a performant sunlight estimation technique relying solely on Google ARCore and simple calculations. This makes it well suited to run on any smartphone supported by ARCore.

Sunlight estimation

Sunlight estimation currently consists of four major steps:

  1. The sun position in the sky is calculated for the current device location on earth and its local time.
  2. This position is transformed into the world coordinate system of the scene using orientation sensors of the device.
  3. A directional light source is updated using the computed transform.
  4. Environmental influences are simulated if possible and feasible.

Calculating the sun position

The sun’s horizontal coordinates, consisting of altitude and azimuth, describe its position in the sky. Altitude is the sun’s angle above the astronomical horizon: 90° at the zenith (directly above the observer), −90° at the nadir (directly below), and 0° when the sun touches the horizon. Azimuth, on the other hand, is the angle along the observer’s astronomical horizon, which astronomers traditionally define as:

  • 0° in the south
  • 90° in the west
  • -90° in the east
  • 180° in the north
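Most device compasses report bearings the other way around, with 0° at north and angles increasing clockwise, so the astronomical azimuth has to be remapped at some point. A minimal sketch of that conversion (class and method names are mine, not taken from the app):

```java
public class AzimuthConvention {

    // Converts an astronomical azimuth (0° = south, 90° = west, as listed above)
    // to a compass bearing (0° = north, 90° = east, clockwise), in degrees.
    public static double toCompassBearing(double astronomicalAzimuth) {
        double bearing = (astronomicalAzimuth + 180.0) % 360.0;
        return bearing < 0 ? bearing + 360.0 : bearing;
    }

    public static void main(String[] args) {
        System.out.println(toCompassBearing(-90.0)); // east: prints 90.0
    }
}
```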

The current time is given as a UNIX timestamp and, after adding the local time zone offset, converted into a Julian Date (JD) for further calculation. An excellent article by Dr Louis Strous provides formulas for calculating the sun’s horizontal coordinates. I highly suggest reading at least part of that article before continuing. The implementation of the relevant formulas (1, 6–9, 17, 19, 20–25) can be found here.
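The timestamp-to-Julian-Date step itself is a one-liner, since the UNIX epoch corresponds to a known Julian Date. A sketch, assuming a timestamp in seconds (the class name is illustrative):

```java
public class JulianDate {

    // The UNIX epoch (1970-01-01 00:00 UTC) corresponds to JD 2440587.5.
    public static double fromUnixSeconds(long unixSeconds) {
        return unixSeconds / 86400.0 + 2440587.5;
    }

    public static void main(String[] args) {
        // 946728000 s is 2000-01-01 12:00 UTC, the J2000.0 epoch.
        System.out.println(fromUnixSeconds(946728000L)); // prints 2451545.0
    }
}
```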

After transforming the horizontal coordinates of the sun into the scene’s coordinate system, they are applied to the virtual sun.

Quaternion sunRotation = Quaternion.multiply(
  Quaternion.axisAngle(Vector3.up(), (float) sunCoordinates.azimuth),
  Quaternion.axisAngle(Vector3.right(), (float) sunCoordinates.altitude)
);
 
// Using northRotation (the angle between the device's camera and geographic north)
Quaternion localSunRotation = Quaternion.multiply(northRotation, sunRotation.inverted());
 
// Apply the rotation to the default Sceneform sun node.
sun.setWorldRotation(localSunRotation);

With sufficient sensor accuracy, a result as seen in the image below can be achieved. Shadows are not being merged, as the application has no information about real shadows in the scene.

AR sunlight estimation result

Environmental influences

Magnetometer (compass) precision is of utmost importance for an accurate representation of the sun. When using the application, it’s crucial to keep a distance from magnets, electrical appliances and large metal objects, as these degrade compass accuracy. Further inaccuracy is introduced by magnetic declination: the angle between magnetic north (which the compass measures) and true geographic north. Depending on where you are on earth, it can vary greatly; see the table below for examples.

Place                      Coordinates            Magnetic Declination
Madrid, Spain              40.44° N, 3.69° W      0.61° W ± 0.32°
New York, USA              40.71° N, 73.91° W     12.92° W ± 0.36°
Cape Town, South Africa    33.97° S, 18.48° E     25.41° W ± 0.57°
Qaanaaq, Greenland         77.48° N, 69.35° W     45.28° W ± 1.30°

Attentive readers might have noticed that magnetic declination is not yet taken into account. Luckily, this can easily be corrected using code from the OpenSphere application together with the current location and altitude. The declination is then added to the current device azimuth.
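Applying the correction is simple arithmetic once the declination is known; on Android, `android.hardware.GeomagneticField` can also provide it via `getDeclination()`. A sketch with the usual sign convention (east declination positive, west negative; the class name is mine):

```java
public class DeclinationCorrection {

    // True bearing = magnetic bearing + declination, normalised to [0°, 360°).
    // East declination is positive, west declination is negative.
    public static double trueBearing(double magneticBearingDeg, double declinationDeg) {
        double bearing = (magneticBearingDeg + declinationDeg) % 360.0;
        return bearing < 0 ? bearing + 360.0 : bearing;
    }
}
```

For New York’s 12.92° W from the table above, a magnetic bearing of 0° corresponds to a true bearing of roughly 347.1°.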

Building shadows

Another interesting aspect of sunlight estimation is its absence: shadows cast by the environment. While shadows produced by trees or other irregular objects are impossible to predict, the buildings around the user are a different story. The collaborative map project OpenStreetMap (OSM) offers extensive, user-generated 3D building data. OSM defines buildings as a collection of latitude/longitude polygons with associated heights; see their reference for details.
The OSMBuildings API is used to retrieve building data. To transform a given set of latitude/longitude vertices into the virtual world, the earth is approximated as a perfect sphere with a circumference of 40,075 km.

 
public static Vector3 getVectorInMeters(double startLat, double startLon, double endLat, double endLon) {
  // One degree of latitude is approximately 111.32 km.
  float north = (float) ((endLat - startLat) * 111320.0);

  // One degree of longitude is 40,075 km * cos(latitude) / 360.
  float east = (float) ((endLon - startLon) * 40075000.0 * Math.cos(Math.toRadians(startLat)) / 360.0);

  // The height difference is always set to 0; buildings sit on the ground plane.
  return new Vector3(north, east, 0);
}
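Applied to a whole OSM footprint, the same spherical approximation turns a polygon into local metric offsets relative to its first vertex. A self-contained sketch (the class below is mine, not part of the app):

```java
import java.util.ArrayList;
import java.util.List;

public class FootprintProjector {

    private static final double EARTH_CIRCUMFERENCE_M = 40_075_000.0;
    private static final double METERS_PER_DEG_LAT = 111_320.0;

    // Projects a lat/lon polygon to (north, east) offsets in metres,
    // relative to the polygon's first vertex.
    public static List<double[]> project(double[][] latLonPolygon) {
        double lat0 = latLonPolygon[0][0];
        double lon0 = latLonPolygon[0][1];
        double metersPerDegLon =
            EARTH_CIRCUMFERENCE_M * Math.cos(Math.toRadians(lat0)) / 360.0;

        List<double[]> result = new ArrayList<>();
        for (double[] vertex : latLonPolygon) {
            result.add(new double[] {
                (vertex[0] - lat0) * METERS_PER_DEG_LAT, // north offset
                (vertex[1] - lon0) * metersPerDegLon     // east offset
            });
        }
        return result;
    }
}
```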

Sceneform renderable definitions are created using the calculated vertices. See the code here. The following image shows a side-by-side view of OSM building data and the transformed, scaled-down buildings in the application.

Simulated buildings in the application

High GPS accuracy is needed to place buildings correctly, which a smartphone usually cannot provide. Differential GPS or better sensors could yield a more satisfactory result.

Use Case

A potential use case of sun shadow simulation presents itself to manufacturers of sun blinds or awnings. The application provides a convenient way to show a potential customer how a new sun-blocking addition to their building or patio will affect the area around it. The customer could be shown different sun positions throughout the day or year. In this context, light blocked by buildings can most certainly be ignored: if the area lay in shadow all day, a blind would not be necessary at all. In addition, the presentation should usually depict a sunny day, regardless of real weather conditions. The issue of sunlight being altered by semi-blocking objects (like trees) remains.

Conclusion

I presented a resource-friendly approach to simulating sun shadows in an AR application. Keep in mind that the results are not expected to be as accurate as real image-based light estimation; rather, the approach serves as an alternative where laboratory conditions cannot be guaranteed. Improvements can be made by using higher-precision GPS sensors for more accurate building placement.

This post is a summary of Paul’s Bachelor’s thesis, which he wrote at codecentric Karlsruhe.
