Vera Rubin will help us find the weird and wonderful things happening in the solar system

By repeatedly surveying the sky, the VRO will spot any changes or astronomical transients. Astronomers call this type of observation time domain astronomy.

When the VRO spots something transient in the night sky, it'll automatically send alerts out to other observatories that will observe the transient object in detail. It could be a distant supernova explosion, a hazardous asteroid here in the inner solar system, or anything that registers a change in the sky. The VRO's job is to spot it and then pass the baton to other observatories.

But issuing alerts to other telescopes is just one of the things the VRO will do. The VRO's primary observing program is called the Legacy Survey of Space and Time (LSST). The LSST will catalogue the entire available sky by imaging it every night for 10 years with the observatory's massive 3.2-gigapixel camera. The camera will capture a 15-second exposure, then take about five seconds to slew and settle on the next patch of sky before repeating.

This decade-long effort will generate an enormous amount of data. The observatory will take 200,000 images per year, amounting to 1.28 petabytes of data. There'll be so much data that the VRO project includes a new dedicated data pipeline carrying it from the observatory's site in northern Chile back to the U.S. No human team could process it all, so machine learning will play a big role in handling the data and finding what's hidden in it.
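Those two figures are consistent with each other. As a back-of-envelope check, 3.2 gigapixels at an assumed two bytes per pixel (an assumption for illustration; the real raw data format differs in the details) gives about 6.4 GB per raw image, and 200,000 such images per year lands almost exactly on the quoted 1.28 petabytes:

```python
# Back-of-envelope check on the quoted data volume. The two-bytes-per-pixel
# figure is an assumption for illustration, not the camera's actual raw format.
pixels_per_image = 3.2e9        # 3.2 gigapixels
bytes_per_pixel = 2             # assumed 16-bit readout
images_per_year = 200_000

image_bytes = pixels_per_image * bytes_per_pixel   # 6.4e9 bytes, ~6.4 GB per image
yearly_bytes = image_bytes * images_per_year       # 1.28e15 bytes, ~1.28 PB per year
```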

The authors of a new research paper developed a novel way for the observatory to detect anomalies in the immense amount of data it generates. The paper is "The Weird and the Wonderful in our Solar System: Searching for Serendipity in the Legacy Survey of Space and Time." It's been accepted for publication in The Astronomical Journal, and the lead author is Brian Rogers from the Department of Physics at the University of Oxford. It is available on the arXiv preprint server.

The Vera Rubin Observatory at twilight in April 2021. It's been a long wait, but the observatory should see first light later this year. Credit: Rubin Obs/NSF/AURA

In simple terms, neural networks are a subset of machine learning, which is a subset of artificial intelligence. Without these tools, astronomers would have no hope of processing all of the data the VRO will generate. Credit: Evan Gough

This simple schematic illustrates the general architecture of an autoencoder. It takes an input, encodes it into a compressed latent representation, then decodes that representation to reconstruct the input. Credit: Rogers et al. 2024
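The encode-decode loop in that schematic can be sketched as a minimal linear autoencoder in plain NumPy. This is a toy illustration, not the paper's architecture: the data, dimensions, and training settings here are all made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature table: 200 objects with 6 features apiece. Purely
# illustrative -- not the paper's real feature set.
X = rng.normal(size=(200, 6))

n_features, n_latent = X.shape[1], 2
W_enc = rng.normal(scale=0.3, size=(n_features, n_latent))  # encoder weights
W_dec = rng.normal(scale=0.3, size=(n_latent, n_features))  # decoder weights

def reconstruct(X):
    Z = X @ W_enc        # encode: compress to the 2-D latent representation
    return Z @ W_dec     # decode: expand back to feature space

loss_before = np.mean((X - reconstruct(X)) ** 2)

# Train by gradient descent on the mean squared reconstruction error
lr = 0.05
for _ in range(1000):
    Z = X @ W_enc
    err = Z @ W_dec - X
    W_dec -= lr * (Z.T @ err) / len(X)
    W_enc -= lr * (X.T @ (err @ W_dec.T)) / len(X)

loss_after = np.mean((X - reconstruct(X)) ** 2)
```

Because the latent space is smaller than the input, the network can only reconstruct well what it has learned to compress; common patterns come back cleanly, while unusual objects come back distorted.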

This figure from the research shows how the autoencoder's reconstruction loss flags anomalies. It plots reconstruction scores for 3.1 million Solar System objects across the reduced feature space. Blue dots, which are tiny, represent objects with low reconstruction loss. Anomalous objects are shown with enlarged dots in redder colors. "The top 0.01% anomalies are enlarged for this plot. They lie distant from the majority of normal objects in blue," the authors write. Credit: Rogers et al., arXiv (2024). DOI: 10.48550/arxiv.2401.08763
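In spirit, the scoring step works like this: reconstruct every object, compute a per-object reconstruction error, and flag the worst-reconstructed ones as anomaly candidates. A toy NumPy sketch, with made-up data and a stand-in "reconstruction" rather than the paper's trained model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy population: 1,000 ordinary objects clustered together, plus 5
# outliers standing in for "weird" bodies. All values are made up.
normal = rng.normal(0.0, 0.1, size=(1000, 4))
weird = rng.normal(3.0, 0.1, size=(5, 4))
X = np.vstack([normal, weird])

# Stand-in reconstruction: a real autoencoder reconstructs common objects
# well and rare ones poorly; reconstructing everything as the population
# mean mimics that behavior on this toy data.
X_hat = np.broadcast_to(X.mean(axis=0), X.shape)

# Per-object reconstruction loss: mean squared error across features
scores = np.mean((X - X_hat) ** 2, axis=1)

# Flag the highest-loss objects as anomaly candidates
k = max(1, round(0.005 * len(X)))   # top 0.5% here; the paper cuts at 0.01%
anomaly_idx = np.argsort(scores)[-k:]
```

Here the flagged indices are exactly the five planted outliers; at LSST scale, a 0.01% cut over 3.1 million objects yields a few hundred candidates for human follow-up.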

Each of these panels is a different autoencoder output for the top ten anomalies and their data neighbors. The blue dots are normal objects, and the colored dots show how the top ten anomalies relate to them. In this figure, ISOs are number 6, shown in red. The critical takeaway is that the anomalies are easily distinguished from normal objects and are grouped by characteristics like orbital eccentricity or magnitude. Credit: Rogers et al., arXiv (2024). DOI: 10.48550/arxiv.2401.08763