
CEAS Space Journal https://doi.org/10.1007/s12567-021-00380-6

ORIGINAL PAPER

APPARILLO: a fully operational and autonomous staring system for LEO debris detection

Paul Wagner1  · Tim Clausen1

Received: 22 February 2021 / Revised: 20 May 2021 / Accepted: 15 June 2021 © The Author(s) 2021

Abstract
For safe operation of active spacecraft, the space debris population needs to be continuously scanned to avoid collisions between active satellites and space debris. Especially the low Earth orbit (LEO) shows higher risks of collisions due to the highest density of orbital debris. Laser ranging stations can deliver highly accurate distance measurements of debris objects, allowing precise orbit determination and more effective collision avoidance. However, a laser ranging station needs accurate a priori orbit information to track an orbital object. To detect and track unknown orbital objects in LEO, a passive optical staring system is developed here for autonomous 24/7 operation. The system is weather-sealed and does not require any service to perform observations. To detect objects, a wide-angle imaging system with 10° field of view equipped with an astronomical CCD camera was designed and set up to continuously observe the sky for LEO objects. The system can monitor and process several passing objects simultaneously without limitations. It automatically starts an observation, processes the images and saves the 2D angular measurements of each object as equatorial coordinates in the TDM standard. This allows subsequent initial orbit determination and handover to a laser tracking system. During campaigns at twilight the system detected up to 36 objects per hour, with high detection efficiencies for LEO objects larger than 1 m³. It is shown that objects as small as 0.1 m³ can be detected and that the estimated precision of the measurements is about 0.05°, or 7 × the pixel scale.

Keywords Passive optical staring · Orbital debris · Space surveillance · Space situational awareness · Low earth orbit · Initial space debris detection

1 Introduction

The number of space debris objects is increasing constantly, putting active satellites at a higher risk of collisions. Even small debris particles with a size of 1 cm can cause major damage to a satellite; such collisions create new fragments, and this cascading effect drives an exponential growth of the debris population. To avoid collisions between active satellites and debris, the orbital debris population needs to be constantly scanned and cataloged. To keep orbital objects cataloged, their positions need to be measured frequently and with high precision, as required for precise orbit determination. Using the resulting orbit predictions, the risk of a collision between a satellite and a debris object can be calculated and collision avoidance maneuvers can be performed subsequently. Of special interest is the LEO, as it shows the highest density of debris fragments. For resident space objects (RSO) in LEO, only tracking radar or laser ranging sensors are capable of delivering good enough (radar) or highly accurate (laser ranging) data for predictions. Current prediction uncertainties of debris RSO in LEO are based on radar measurements and require a large safety margin, resulting in a 10,000:1 false alert rate [1]. Laser ranging measurements, on the other hand, have an order of magnitude better precision of the distance measurement. This allows much better orbit prediction [2] and hence more effective collision avoidance maneuvers between an active satellite and orbital debris [1].

Due to the small field of view (FOV) of a laser tracking system and the small laser beam divergence, the RSO needs to be tracked continuously with high accuracy. This requires a priori orbit information of the RSO, which needs to be obtained by a separate sensor network. Currently, only radar systems in staring mode can fulfill this task of initial detection of unknown RSO in LEO. Their downside is their high hardware and operating cost. For this reason,

* Paul Wagner [email protected]

1 DLR, Institute of Technical Physics, Pfaffenwaldring 38-40, 70569 Stuttgart, Germany

we developed a passive optical staring system to detect and measure unknown orbital objects in LEO for subsequent laser tracking.

We already reported in our first results [3] that 25% of the objects detected by such a system could not be correlated with the publicly available TLE catalog [4]. The detection efficiency for objects with a radar cross section (RCS) between 1 and 2 m² was already 50%, and almost 100% for objects with an RCS larger than 2.25 m² [5]. We also demonstrated an instant handover to our tracking telescope (UFO) and redetected objects, such as a rocket body, within the 0.27° FOV of the tracking camera, without any a priori information [5].

While former activities required manual observation and processing, here we present a fully operational passive optical surveillance system called APPARILLO (Autonomous Passive Optical Staring Of LEO Flying Objects), which is ready to contribute to a space surveillance network. It is built for 24/7 autonomous operation to detect orbital objects in LEO and export their measured tracklets in the tracking data message (TDM) format [6].

The system is the foundation of future Stare and Chase handover, where an initial orbit determination (IOD) is calculated instantly from the measurements taken by the staring sensor. This orbit prediction will be sent to a tracking telescope which can perform subsequent tracking and laser ranging. The ranging data allow precise orbit determination and cataloging of a newly detected LEO RSO. This concept was previously published [7] and protected under the utility patent DE 20 2017 101 831 U1 in Germany [8].

The current system incorporates subsequent improvements to ensure autonomous operation, including weather sealing, automatic data recording, and processing. The photograph in Fig. 1 below shows the current system during the observation campaign in December 2020. It also illustrates the detection principle of the camera system: the camera records stars as point sources and LEO objects as streaks due to their larger angular velocity. The streak recorded in the picture is the International Space Station (ISS) and the bright object in the center is the full moon.

2 Performance estimation

A passive optical sensor benefits from the fact that the sun illuminates the RSO and the reflected light from its surface can be detected via a camera. To detect this signal, the background illumination at the ground-based sensor needs to be low; a clear sky at night is required for successful operation. Furthermore, observations are not possible if the RSO is in the Earth's shadow, which is why, under certain observation conditions, RSO are not detectable around midnight. This, for example, is the case for a zenith line-of-sight (LOS) in winter on the northern hemisphere.

To model the system performance, a spherical RSO illuminated by the Sun is assumed. Using the modeled signal-to-noise ratio (SNR), the minimum detectable RSO diameter dRSO can be calculated for a set of system parameters as follows [9]:

$$d_{RSO}=\sqrt{\frac{4\,\omega_{R}\,f'\,R_{RSO}^{2}\,\mathrm{SNR}\sqrt{\dfrac{D_{opt}^{2}\,\tau_{opt}\,\pi\,y_{px}^{2}\,t_{exp}\,L_{b}\,\mathrm{QE}}{D_{opt}^{2}+4f'^{2}}+e_{read}^{2}}}{2.755\cdot10^{21}\,P(\rho,\psi)\,\tau_{atm}\,D_{opt}^{2}\,\pi\,\tau_{opt}\,y_{px}\,\mathrm{QE}}}, \quad (1)$$

where ypx is the pixel size, f' the focal length, Dopt the diameter of the aperture, and τopt the transmission of the optical system. Furthermore, τatm is the transmission of the atmosphere, QE the quantum efficiency of the detector, SNR the signal-to-noise ratio required by the detection algorithm, eread the read noise per pixel, Lb the background illumination, texp the exposure time, ωR the angular velocity of the RSO, RRSO the slant range to the RSO and P(ρ, ψ) the phase function of the RSO. Please see reference [9] for more details.

Fig. 1 APPARILLO staring system (bottom right) during observation in December 2020. The system has a movable sensor head and a cabinet. At the top of the image, the ISS is visible as a streak. The bright object in the center is the full moon and to the top left Mars is visible. The system is located on top of the DLR office building in Stuttgart, Germany. Credit: Paul Wagner

A more detailed analysis of theoretical system performance parameters of small telescopes for detection and tracking is given in reference [10]. These include the positional accuracy of object tracking, the limiting magnitude, and the information rate of the system. Design parameters are defined which describe optimal system performance. The defined metric for the system FOV is the instantaneous field of regard (IFOR), defined as

$$\mathrm{IFOR}=\left(\frac{y_{px}}{f'}\right)^{2}, \quad (2)$$

where a larger value follows from a larger FOV and therefore covers a larger orbital volume. The limiting magnitude mv of such a system is defined as

$$m_{v}=-2.5\,\log_{10}\!\left(\frac{\mathrm{SNR}\;\sqrt{m_{RSO}}\;\omega_{R}\,f'\,\sqrt{e_{b}+e_{read}}}{\Phi_{0}\,\tau_{atm}\,\tau_{opt}\,\dfrac{\pi D_{opt}^{2}}{4}\,\mathrm{QE}\;y_{px}}\right), \quad (3)$$

where Φ0 is the irradiance of a magnitude zero object, mRSO the number of pixels occupied by the RSO, and eb the electrons caused by the background sky irradiance per pixel.

The information rate shows that the detection performance does not only depend on the sensitivity but also on the FOV, the pixel scale and the number of crossing RSO. This metric will not be analyzed in more detail but shows the dependencies of the system performance in an analytical way. The information rate metric depends on the density of RSO per deg² nRSO, the FOV, the information objective JI, and the probability of successful RSO detection P(ΓRSO+n > SNR σn):

$$J_{F}=\frac{n_{RSO}\,\mathrm{FOV}\,J_{I}\,P\!\left(\Gamma_{RSO+n}>\mathrm{SNR}\,\sigma_{n}\right)}{t_{exp}}, \quad (4)$$

where the information objective JI is defined as

$$J_{I}=\log_{10}\!\left(8\left(\frac{f'}{y_{px}}\right)^{2}\right). \quad (5)$$

The metric JI gives the relative amount of information in a single observation and is inversely proportional to the uncertainties.

Further details about theoretical system performance metrics can be found in ref. [10]. For the system under consideration, Table 1 lists the simulation parameters and results. Due to the small focal length compared to a tracking system [10], the information metric is small; in turn, the IFOR is considerably larger and allows a larger orbital volume to be observed. The minimum detectable RSO diameter is rather small compared to the limiting magnitude of this system.
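To make the size estimate concrete, the short Python sketch below evaluates the reconstructed forms of Eqs. (1), (2) and (5) with the parameters of Table 1. It is a minimal illustration under the assumption that the equations were recovered correctly from the print layout; it is not part of the original APPARILLO software.

```python
import math

# Simulation parameters from Table 1 (SI units)
y_px   = 9e-6        # pixel size [m]
f      = 0.2         # focal length [m]
D_opt  = 0.1         # aperture diameter [m]
tau_opt, tau_atm = 0.75, 0.7
QE     = 0.55
SNR    = 4
e_read = 15          # read noise per pixel [e-]
L_b    = 2.38e14     # background illumination [ph/s/m^2/sr]
t_exp  = 1.0         # exposure time [s]
omega_R = math.radians(0.5)  # angular velocity of the RSO [rad/s]
R_RSO  = 1000e3      # slant range [m]
P_phase = 0.08       # phase function P(rho, psi)

# Eq. (1): minimum detectable RSO diameter
background = (D_opt**2 * tau_opt * math.pi * y_px**2 * t_exp * L_b * QE
              / (D_opt**2 + 4 * f**2))
numerator = 4 * omega_R * f * R_RSO**2 * SNR * math.sqrt(background + e_read**2)
denominator = 2.755e21 * P_phase * tau_atm * D_opt**2 * math.pi * tau_opt * y_px * QE
d_RSO = math.sqrt(numerator / denominator)

# Eq. (2): instantaneous field of regard (pixel IFOV squared), in arcsec^2
ifor_arcsec2 = (math.degrees(y_px / f) * 3600) ** 2

# Eq. (5): information objective
J_I = math.log10(8 * (f / y_px) ** 2)

print(f"d_RSO ~ {d_RSO:.2f} m")            # ~0.25 m, Table 1 lists 0.24 m
print(f"IFOR  ~ {ifor_arcsec2:.0f} arcsec^2")  # ~86 arcsec^2
print(f"J_I   ~ {J_I:.1f}")                # ~9.6
```

The computed values reproduce the Table 1 results to within rounding, which supports the reconstruction of Eqs. (1), (2) and (5).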

3 System details

The system consists of an imaging camera with a lens, a GPS receiver for time synchronization, a computer for image recording and processing, a weather station, and a weather-proof housing. The latest observations were performed on the roof of our office building in Stuttgart-Vaihingen. The core system is already operational with a camera, a lens and a notebook, which makes it easy to set up and operate almost everywhere. To avoid any maintenance and manual operation, the system was extended with a camera and lens mount for reliable camera pointing and a housing for environmental protection. Environmental data measured by the weather station are used to toggle camera acquisition. Figure 2 below shows the components and the connections between them.

3.1 Camera

The system is based on passive optical measurements, and the basic components are a camera and a lens to perform angular measurements. As camera, we use a large-area CCD (charge-coupled device) imaging sensor,

Table 1 Simulation parameters and results of the APPARILLO staring system

Properties                                Symbol    Value
Pixel size                                ypx       9 µm
Focal length                              f'        0.2 m
Diameter of the aperture                  Dopt      0.1 m
Transmission of the optical system        τopt      0.75
Transmission of the atmosphere            τatm      0.7
Quantum efficiency                        QE        0.55
Signal-to-noise ratio                     SNR       4
Read noise per pixel                      eread     15 e−
Background illumination                   Lb        2.38·10¹⁴ ph/s/m²/sr (17.5 mag/arcsec²)
Background electrons                      eb        15,000 e−
Exposure time                             texp      1 s
Angular velocity of the RSO               ωR        0.5°/s
Slant range to the RSO                    RRSO      1000 km
Phase function of the RSO                 P(ρ, ψ)   0.08
Irradiance of magnitude 0 source          Φ0        5.6·10¹⁰ ph/s/m²
Number of pixels occupied by the RSO      mRSO      2 px

Results
Minimum detectable RSO diameter           dRSO      0.24 m
Limiting magnitude of the optical system  mv        3.5 mag
Instantaneous field of regard             IFOR      86 arcsec²
Information metric                        JI        9.6

namely the FLI PL09000, which uses the ON Semi KAF-09000 CCD sensor. The large pixel size results in a large full well capacity of 110,000 e− and hence a very good dynamic range, whereas the readout and dark noise are very low, with a total of 15 e− per frame. There is also no pattern noise visible, which is typically present in CMOS (complementary metal–oxide–semiconductor) sensors. The image sensor diagonal is 51.7 mm, and the resulting FOV is listed in the next Sect. 3.2. The following Table 2 lists more camera specifications with the settings used for observations.

The downside of these cameras is their relatively slow shutter, with a total opening time of 54 ms and a closing time of 52 ms [12]. Another downside is the slow image readout and transfer speed, which results from the high resolution in combination with the CCD readout architecture and the USB 2.0 interface. This limits their use to long exposures only; the maximum image recording frequency with our soft- and hardware was 0.2 Hz using a binning of 2 (2 px × 2 px) and 0.1 Hz using full resolution.

3.2 Lens

The lens choice is a tradeoff between a large aperture and a short focal length. A short focal length results in a larger FOV, which covers a larger orbital volume. A larger aperture diameter results in a higher sensitivity of the system and therefore better detectability of smaller (fainter) objects.

To keep system size and cost small, we decided to use a common single-lens-reflex (SLR) medium telephoto lens. These are commercially available, have a very good image quality across a large image circle (even beyond their designed 43 mm) and are affordable (compared to their alternatives). Table 3 below lists the specifications of the lens used during the latest observations, including the

Fig. 2 Schematic view of the APPARILLO system components and how they are connected. Camera and lens are located in the sensor head and the microcontrollers and PC are located in the cabinet. The core components are marked in red and secondary hardware in blue

Table 2 Camera specifications and exposure settings used by the APPARILLO system [11, 12]

Properties                                     FLI PL09000
Sensor                                         ON Semi KAF-09000
Sensor size: width × height, diagonal          36.8 mm × 36.8 mm, 51.7 mm
Pixels                                         3056 × 3056
Pixel count                                    9.3 Mpx
Pixel size                                     12 µm
Full well capacity                             110,000 e−
Typical system noise (@ 8 MHz readout speed)   15 e−
Dark noise (@ −35 °C)                          0.02 e−
Shutter                                        Mechanical blade shutter (Uniblitz CS-65)
Common exposure speed                          1 s
Common binning setting                         2
Readout rate                                   8 MHz
Transfer interface                             USB 2.0

resulting angular properties with the camera given in the previous Sect. 3.1 (see Table 2).

A picture of the camera and lens mounted inside the weather-sealed housing can be seen in Fig. 3 below (more details follow in Sect. 3.5).

For this lens, the image quality is very good across the entire image frame, resulting in a point spread function of a star covering only a few pixels, see Fig. 4. The lens shows an illumination fall-off toward the image corners; sample images are shown below (e.g. Fig. 6). The vignetting at this small FOV results mainly from the optical construction, where the entrance pupil is obscured by lens element borders, which is typical for lenses with a small focal ratio (f#) [14]. But even without this, vignetting would be present due to the cosine-fourth-power law. A degradation of the image quality by the window could not be measured or observed. Figure 4 contains two cropped images, one from the image center and the other from the outer corner of the image. It can be seen that the stars are recorded as symmetric points across the entire image and measure about 1.5 px in FWHM.

3.3 GPS synchronization

For GPS synchronization, an Arduino-based GPS timer was developed; it consists of an Adafruit Ultimate GPS receiver and an Arduino Uno microcontroller [15]. It can be used to record the UTC timestamp of an incoming TTL pulse from the camera. The microcontroller compares the incoming TTL pulse from the camera with the PPS signal provided by the GPS receiver. This way, it measures the time of a TTL signal with a precision of 100 µs [16].

3.4 Weather station

As weather station, the Diffraction Limited Boltwood Cloud Sensor II was used. The weather station records the temperature difference between the sky (using an infrared sensor) and the ambient air, the light level, and whether it is raining. This information is used to verify clear sky, darkness, and absence of precipitation, respectively. The following conditions are calculated from the weather data (a minimal sketch of this logic is given after the list):

• Clear sky: is True if the difference between sky and ambient temperature is less than −28 K.
• Darkness: is True if the light level is smaller than 10 a.u.
• Dry: is True if rain equals 0.
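As an illustration of these thresholds, the snippet below derives the three booleans from a set of weather readings. It is a hypothetical sketch: the field names and the dataclass are ours, not the Boltwood driver interface used by OOOS; only the threshold values are taken from the list above.

```python
from dataclasses import dataclass

@dataclass
class WeatherReading:
    sky_minus_ambient_K: float  # sky temperature minus ambient temperature [K]
    light_level_au: float       # relative light level [arbitrary units]
    rain: int                   # 0 = no rain detected

def observation_conditions(w: WeatherReading) -> dict:
    """Evaluate the three observation conditions described in Sect. 3.4."""
    return {
        "clear_sky": w.sky_minus_ambient_K < -28.0,  # cold sky -> no clouds
        "darkness":  w.light_level_au < 10.0,
        "dry":       w.rain == 0,
    }

# Example: a clear, dark, dry winter night
print(observation_conditions(WeatherReading(-35.0, 2.0, 0)))
# {'clear_sky': True, 'darkness': True, 'dry': True}
```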

3.5 Weather‑proofed housing

The weather-proofed housing was developed by Raymetrics1 and is an adaptation of a wind LIDAR system. Its IP68 rating protects the equipment from the environment. The head is weather-sealed and has a viewing window. Blowers in front of the window as well as a thermoelectric cooler (TEC) prevent condensation on the window and control the temperature and humidity; this ensures that each component runs within its specified operating conditions. The sensor head contains the camera (shown in Sect. 3.1) and lens

Table 3 Lens properties of the lens used during observation campaigns [13]

Properties                              Canon EF 200 mm f/2 L IS USM
Bayonet                                 Canon EF
Focal length                            200 mm
f#                                      2
Aperture diameter                       100 mm
FOV: width × height / diagonal (@ FLI)  10.4° × 10.4° / 14.5°
Pixel scale (@ PL09000)                 12.4 arcsec/px (60 µrad/px)

Fig. 3 The FLI PL09000 camera and Canon EF 200 mm f/2 lens are mounted inside the sensor head. A rail on both sides allows mounting the lens and camera. In the back of the head is the TEC (top of the image), and blowers in front of the window (bottom of the image) prevent condensation

1 Raymetrics S.A., 32 Spartis Str, Metamorfosis, GR-14452, Athens, Greece, www.raymetrics.com.

(shown in Sect. 3.2) for recording the images, see Fig. 3. The head is movable between 90° and −90° with respect to the horizon. It points upwards to zenith (or any other suitable elevation angle) during observation and moves downwards when image recording is stopped. This protects the viewing window from precipitation and prevents the sun from accidentally being focused onto the image sensor or shutter blades of the camera.

The main compartment, or cabinet, is the control unit, which contains the main controller of the enclosure (SMU200), a 1U UPS, and a power distribution panel. These are mounted on a sliding 19″ rack with 4U of free space to accommodate our workstation computer. On top of the enclosure are mounted the Boltwood weather station, the GPS antenna, and the power supply unit box (which provides DC current to the electronics). Similarly, the enclosure on the head is environmentally sealed and temperature-controlled. Figure 5 shows the housing and its components with the sensor head pointing upwards.

3.6 Image recording

Image recording is performed with OOOS (Orbital Objects Observation Software), which has been developed by our department for satellite laser ranging (SLR) activities and is highly modular [17]. The software records the images taken by the camera and includes information about the observation location, the observation line-of-sight (LOS), the UTC timestamp from our GPS image timer, and metadata of the camera settings and optics used.

A module named Staring Daemon handles the automatic image acquisition depending on the data provided by the weather station and the current time. The setup without any optical filters in front of the lens allows observations with a sun elevation of about −6° and lower with respect to the horizon. This time is calculated automatically, and image acquisition is started or stopped depending on the time only if no weather data are available. The sensor head is moved upwards pointing towards the sky and image acquisition is started if all of the weather conditions (shown in Sect. 3.4) and time constraints are satisfied.
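The sketch below illustrates this gating rule with astropy's solar ephemeris, reusing the hypothetical observation_conditions() output from Sect. 3.4. It is our own illustration under the stated assumptions (station coordinates from Table 4, −6° sun-elevation threshold), not the actual Staring Daemon code.

```python
from typing import Optional

import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, get_sun
from astropy.time import Time

# Station coordinates from Table 4 (Stuttgart rooftop)
STATION = EarthLocation(lat=48.74885 * u.deg, lon=9.10257 * u.deg, height=486 * u.m)

def sun_elevation_deg(t: Time) -> float:
    """Apparent solar elevation at the station for time t."""
    sun = get_sun(t).transform_to(AltAz(obstime=t, location=STATION))
    return sun.alt.to_value(u.deg)

def start_observation(t: Time, weather: Optional[dict]) -> bool:
    """Gate image acquisition as described in Sect. 3.6: the sun must be at
    least 6 deg below the horizon; if weather data are available, all weather
    conditions must additionally be satisfied, otherwise time alone decides."""
    if sun_elevation_deg(t) > -6.0:
        return False
    if weather is None:           # weather station offline -> time-based fallback
        return True
    return all(weather.values())  # e.g. output of observation_conditions()

print(start_observation(Time("2020-12-17T03:09:00"),
                        {"clear_sky": True, "darkness": True, "dry": True}))
```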

Fig. 4 Cropped image of stars recorded by the camera from the center (left) and bottom right corner of the image (right)

Fig. 5 Image of the Raymetrics enclosure during observation just after sunset

If one of the conditions is not satisfied, the image acquisition is stopped and the head moves downwards. If the weather station data are not available, all conditions are considered as satisfied and image recording is started and stopped based on the computer's system time only. This requires the image processing to handle bad images. More on image processing, astrometric calibration and TDM export of tracklets is described in the following Sect. 4. When image recording is started by the sensor, astrometric calibration is performed regularly to determine the exact pointing direction of the camera system. Typical sample images are shown in Fig. 6.

4 Software structure

The image recording is managed by the so-called Staring Daemon, which is a Python 3 program based on the OOOS software package [17]. The Staring Daemon handles the connected hardware:

• Weather station,
• GPS Timer (Arduino microcontroller),
• Camera,
• Enclosure (Raymetrics),

starts the image acquisition and handles the data export.

4.1 Staring Daemon

All parts of the software are separated; the hardware interfaces in particular are worth mentioning, as this means the software is not hardware-bound, allowing the user or system designer to select hardware independently. The weather station is controlled by the Environment Daemon, which records the weather information. The camera is controlled by the Acquisition Process, which is also connected to the GPS timer. The Staring Daemon itself is connected to those processes and daemons using high-level commands. It reads the recorded weather data and toggles the image recording and sensor head position depending on the weather conditions and time. The Staring Daemon also handles the communication to the image processing program and hands over information and measurements to be uploaded to our website by the Internet Daemon. Figure 7 gives an overview of the external programs (Space Debris [18], Astrometry.net [20]), OOOS daemons, OOOS processes and the data transfer between them.

4.2 Image processing

Image processing is performed by a separate program written in C++. The software was developed in cooperation with Kormoran Technologie GmbH2 and is simply called “Space Debris” [18]. It reads in the recorded FITS image files and uses OpenCV [19] algorithms to process the images. A major challenge for the processing is the presence of clouds in the images, see Fig. 8 below. Combined with stray light from artificial light sources, this caused too many false-positive detections previously [3]. For autonomous operation, the new software was included, and false-positive detections

Fig. 6 Two sample images of the APPARILLO staring sensor taken during observations on December 11th at 4:50 am and 5 am

2 Alte Poststraße 24, 88690 Uhldingen-Mühlhofen, Germany, www.kormoran-tech.com.

could be rejected completely during our campaigns in December 2020.

To detect objects in the images, the background intensity profile is determined by first filtering high frequencies from the image. The result is subtracted from the original image to remove the background. This process is performed iteratively, which removes clouds and other intensity variations. The image can then be binarized to separate the objects (e.g. stars or streaks) from the background. Stars and streaks are distinguished by their size and shape. Subsequently, the pixel coordinates of the stars and streaks are measured. The star positions are used to perform astrometric calibration, which allows the streak coordinates to be converted into equatorial coordinates.
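The actual processing is done by the C++ “Space Debris” program described above; the Python sketch below only mirrors the described steps (background estimation by suppressing high spatial frequencies, subtraction, binarization, and separating stars from streaks by shape) using OpenCV's Python bindings. The filter sizes, thresholds and elongation limit are illustrative assumptions, not the values used in the real software.

```python
import cv2
import numpy as np

def detect_streaks(image: np.ndarray):
    """Rough sketch of the detection steps described in Sect. 4.2."""
    img = image.astype(np.float32)

    # 1) Background profile: keep only low spatial frequencies (heavy blur),
    #    then subtract it; repeating suppresses clouds and slow gradients.
    for _ in range(3):
        background = cv2.GaussianBlur(img, (0, 0), sigmaX=25)
        img = cv2.subtract(img, background)

    # 2) Binarize to separate objects from the residual background.
    _, binary = cv2.threshold(img, 5 * img.std(), 255, cv2.THRESH_BINARY)
    binary = binary.astype(np.uint8)

    # 3) Label connected components and classify by size and shape:
    #    roughly round blobs are stars, strongly elongated ones are streaks.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    stars, streaks = [], []
    for i in range(1, n):  # label 0 is the background
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        area = stats[i, cv2.CC_STAT_AREA]
        if area < 3:
            continue
        elongation = max(w, h) / max(1, min(w, h))
        (streaks if elongation > 5 else stars).append(tuple(centroids[i]))
    return stars, streaks
```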

As astrometric software, the engine of the Astrometry.net project [20] is used, installed as a separate program. Up to this stage, all calculations are done separately for each image.
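The following sketch shows how such a per-image plate solution can be used to turn streak pixel coordinates into equatorial coordinates. It assumes Astrometry.net's solve-field command-line tool is installed and that it writes a <name>.wcs header next to the input frame, as it does by default; the scale bounds are illustrative and this is not the calling convention of the actual pipeline.

```python
import subprocess
from astropy.io import fits
from astropy.wcs import WCS

def streaks_to_radec(fits_path: str, streak_pixels):
    """Plate-solve one frame with astrometry.net and convert streak
    pixel coordinates to equatorial coordinates (ICRS, degrees)."""
    # Bounding the pixel scale (arcsec/px) speeds up the blind solution.
    subprocess.run(
        ["solve-field", "--overwrite", "--no-plots",
         "--scale-units", "arcsecperpix", "--scale-low", "20", "--scale-high", "30",
         fits_path],
        check=True,
    )
    wcs = WCS(fits.getheader(fits_path.replace(".fits", ".wcs")))
    sky = wcs.pixel_to_world([x for x, _ in streak_pixels],
                             [y for _, y in streak_pixels])
    return [(c.ra.deg, c.dec.deg) for c in sky]
```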

4.3 Data export

In the next step, the equatorial coordinates of the streaks observed in several different images are combined into traces based on their angular velocity and direction in the sky. Finally, a straight line with constant velocity is fitted to the observed data. This way, a circular fit of the measured coordinates is obtained, which is used to remove any outliers. The results are written into

Fig. 7 A schematic illustration of how the different sub-processes (purple) and hardware interfaces (blue) are connected to the main program “Staring Daemon” (green). Data input or output is shown as dashed lines and external software programs are marked in red

Fig. 8 Sample image of the staring sensor of a partly cloudy sky (left). The software is designed to handle clouds and still detect orbital objects recorded as streaks (right). As long as a dozen stars are detectable to perform astrometric calibration, the measured streak position can be converted to equatorial coordinates

a “Tracking Data Message” (TDM) file [6], which allows data sharing with other stations or databases. These unidentified TDM files are uploaded directly to our web database using the Internet Daemon, which provides the connection for interchanging the data with a separate subsystem. Section 5 shows the resulting data of the first unsupervised campaign of the fully autonomous staring system.
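As a sketch of this export step, the snippet below fits a constant-angular-velocity straight line to a trace, flags outliers, and writes a minimal key-value TDM. The real exporter is part of OOOS; originator, participants and the sigma-clip threshold here are placeholders, and RA wrap-around at 0°/360° is not handled.

```python
import numpy as np

def fit_trace(t, ra, dec, sigma_clip=3.0):
    """Fit RA(t) and Dec(t) with straight lines (constant angular velocity)
    and flag measurements whose residuals exceed sigma_clip standard deviations."""
    t, ra, dec = map(np.asarray, (t, ra, dec))
    p_ra, p_dec = np.polyfit(t, ra, 1), np.polyfit(t, dec, 1)
    res = np.hypot(ra - np.polyval(p_ra, t), dec - np.polyval(p_dec, t))
    keep = res < sigma_clip * res.std()
    return p_ra, p_dec, keep

def write_tdm(path, epochs_utc, ra_deg, dec_deg, station="APPARILLO", target="UNKNOWN"):
    """Write a minimal CCSDS TDM in key-value notation with RA/Dec angles."""
    lines = ["CCSDS_TDM_VERS = 1.0", "ORIGINATOR = DLR", "",
             "META_START", "TIME_SYSTEM = UTC",
             f"PARTICIPANT_1 = {station}", f"PARTICIPANT_2 = {target}",
             "MODE = SEQUENTIAL", "PATH = 2,1",
             "ANGLE_TYPE = RADEC", "META_STOP", "", "DATA_START"]
    for t, a1, a2 in zip(epochs_utc, ra_deg, dec_deg):
        lines.append(f"ANGLE_1 = {t} {a1:.5f}")
        lines.append(f"ANGLE_2 = {t} {a2:.5f}")
    lines.append("DATA_STOP")
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
```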

5 Results

Compared to manual operation [3], the system can now use every minute in which the conditions are good for observation. We operated the system continuously between November 20th and December 23rd. The weather conditions covered classical German winter weather, including storm, rain, fog, frost and snow. The conditions were far from optimal, making it a worst-case scenario for the system. During the campaign, APPARILLO was placed on the roof of our office building as shown in Fig. 1. The observation direction was fixed in the horizontal reference frame over the entire campaign. Zenith was chosen as LOS because it should give the best performance [9]. The system settings, properties and geodetic coordinates are listed in Table 4.

Figure 9 shows a series of 89 images combined to show how streaks are recorded by the system. It contains low-altitude LEO RSO as long streaks, high-altitude LEO RSO as short streaks and rotating RSO with visible intensity variations.

Two more samples of a combined series of images containing a single object in the FOV are shown in Fig. 10 below.

The cloud sensor does not prevent clouds from appearing in the images. Especially transparent high clouds were often recorded, see Fig. 11.

The image processing provides many parameters to be adjusted, which affect the sensitivity as well as the false-negative and false-positive detection rates. The settings were chosen to yield zero false-positive detections. During the latest campaign, this requirement was fulfilled; however, due to the very low air traffic at that time (caused by the COVID-19 pandemic), no aircraft crossed the LOS. To our current knowledge, this is the only case in which the system might falsely detect an RSO.

However, false-negative detections were caused by faint objects or rotating RSO with large intensity variations (as shown in Fig. 10, right). These intensity variations result in higher moments of the grayscale streak, which cause a false rejection. This kind of exclusion is implemented because background stars that coincide with a streak induce misplaced detections. Currently, we cannot provide an exact number for the actual false-negative rate, as human observers show large variations in detecting streaks from a stack of images, which can comprise up to 8500 images per night. In a manual review of 15% of the data, a false-negative rate of 4% was observed: these were streaks which a human could detect, but not the image processing. It should be noted that the human, on the other hand, missed about the same number of streaks which the software detected properly.

Table 4 APPARILLO configuration and settings during the November/December 2020 campaign (20.11.2020–23.12.2020). The LOS was determined by astrometric calibration and the geodetic position was derived from the GPS timer

Parameter              Value
Camera                 FLI PL09000 (see Table 2)
Lens                   Canon EF 200 mm f/2 L IS USM (see Table 3)
FOV                    10.4° × 10.4°
Diameter of aperture   0.1 m
Exposure time          1 s
Frame rate             0.2 Hz
Binning                2
Pixel scale            0.007°/px (24.8 arcsec/px or 120 µrad/px)
Line-of-sight (LOS)    Azimuth 144°, Elevation 88°
Geodetic coordinates   Latitude 48.74885° ± 5e-05°, Longitude 9.10257° ± 4e-05°, Altitude 486 m ± 6 m

Fig. 9 89 images merged to show objects detected by the staring system on November 26th. 10 objects passed the FOV over a duration of 13.5 min between 04:57:18 and 05:04:44

For each detected object, the corresponding images were reviewed to validate the measurements. Next, the recorded TDM data are analyzed to review the performance of the system in more detail.

5.1 Detection statistics

During the campaign, the system ran continuously without any interruptions. The following Table 5 lists how many objects (TDM files) were recorded each night. Furthermore, the number of images recorded, the number of objects identified using the CelesTrak TLE catalog (more details follow at the end of this section) and the resulting

Fig. 10 Left: Cropped image showing an SL-12 rocket body (NORAD: 24829) as a streak passing right in front of the galaxies M81 and M82 (upper right); 8 images taken on 2020-12-17 between 03:09:00.569 and 03:09:36.601 were merged. Right: Intensity variations recorded from GLOBALSTAR M015 (NORAD = 25308); here 12 images recorded on 2020-11-26 between 03:29:21 and 03:30:02 were merged

Fig. 11 Two images showing a high cloud layer and a LEO RSO in each image. Left: STARLINK-1312, (NORAD = 45398) at December 17, 2020 06:12:06:970. Right: USA 25 (NORAD = 18025) at December 11, 2020 05:52:39:514

unidentified objects are listed. An object was correlated with the available TLE predictions [4] and considered identified when all measured coordinates deviate by less than 1° from the prediction. Prediction uncertainties of TLE have been reported to be as large as 517 m in-track, with 137 m average residuals across the catalog for LEO [21]. Considering a 3σ limit, this results in a maximum angular radius of 0.15°.
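A sketch of this correlation step is given below, using the skyfield library to propagate a TLE and compare the predicted topocentric RA/Dec with the measurements against the 1° criterion. The choice of propagator and the small-angle separation (which ignores the cos(Dec) compression of RA) are our assumptions for illustration; the paper does not state which implementation is used.

```python
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()
station = wgs84.latlon(48.74885, 9.10257, elevation_m=486)  # Table 4

def identified(tle_line1, tle_line2, epochs_utc, ra_meas_deg, dec_meas_deg, limit_deg=1.0):
    """Return True if every measured coordinate lies within limit_deg of the
    TLE prediction, mirroring the identification criterion of Sect. 5.1."""
    sat = EarthSatellite(tle_line1, tle_line2, "candidate", ts)
    for (y, m, d, hh, mm, ss), ra_m, dec_m in zip(epochs_utc, ra_meas_deg, dec_meas_deg):
        t = ts.utc(y, m, d, hh, mm, ss)
        ra, dec, _ = (sat - station).at(t).radec()
        # crude angular separation in degrees, adequate for a 1 deg gate
        sep = ((ra.hours * 15.0 - ra_m) ** 2 + (dec.degrees - dec_m) ** 2) ** 0.5
        if sep > limit_deg:
            return False
    return True
```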

Compared to former observation campaigns from 2015 [3], the number of unidentified objects has been reduced by about 1/3 (15% vs. 23%). Former observations were performed manually during good conditions only. These conditions were: good weather with a clear night sky, and LEO RSO illuminated by the sun while the observer is in the Earth's shadow (at night). This corresponds roughly to the interval from 1 h to 5 h after sunset and from 5 h to 1 h before sunrise. Current observations are performed across the entire night depending on the weather conditions, which results in no detections around midnight but many detections after sunset and before sunrise. Figure 12 shows the hourly detection rate of all nights between November 20th and December 23rd. Days without any detections are excluded. The detection rate is shown over the time of day in 1-h steps.

During good conditions, the detection rate is about 30 objects per hour, which is 20% higher than previously

Table 5 Identification statistics of detected objects during the observation campaign in November/December 2020, see Table 4 for system settings

Date         Detected RSO (TDMs)   Identified TLE RSO   Unidentified RSO   Percentage of unidentified RSO (%)   Images
20.11.2020   40    31    9     22.50    3866
21.11.2020   0     0     0     0.00     0
22.11.2020   0     0     0     0.00     0
23.11.2020   17    17    0     0.00     1517
24.11.2020   0     0     0     0.00     189
25.11.2020   64    55    9     14.06    7868
26.11.2020   60    52    8     13.33    6496
27.11.2020   33    22    11    33.33    3396
28.11.2020   52    42    10    19.23    2455
29.11.2020   26    21    5     19.23    5399
30.11.2020   42    31    11    26.19    4755
01.12.2020   0     0     0     0.00     42
02.12.2020   0     0     0     0.00     3
03.12.2020   65    60    5     7.69     2158
04.12.2020   0     0     0     0.00     2548
05.12.2020   0     0     0     0.00     2301
06.12.2020   0     0     0     0.00     4270
07.12.2020   6     5     1     16.67    1218
08.12.2020   3     2     1     33.33    409
09.12.2020   0     0     0     0.00     137
10.12.2020   43    39    4     9.30     1052
11.12.2020   0     0     0     0.00     252
12.12.2020   0     0     0     0.00     5
13.12.2020   69    60    9     13.04    5516
14.12.2020   1     0     1     100.00   1274
15.12.2020   4     2     2     50.00    1628
16.12.2020   133   114   19    14.29    7369
17.12.2020   3     3     0     0.00     3893
18.12.2020   138   117   21    15.22    8546
19.12.2020   7     7     0     0.00     317
20.12.2020   20    20    0     0.00     1906
21.12.2020   0     0     0     0.00     69
22.12.2020   5     4     1     20.00    1892
23.12.2020   0     0     0     0.00     74
Sum          831   704   127   15.28    82,820

measured (24.2 obj/h) [3]. This is due to the larger population of LEO RSO compared to 2015. The largest measured detection rate was 36 objects per hour. It can be seen that the system can only detect objects during the terminator phase (during twilight hours). These are between 2 am and 6 am UTC in the morning and between 4 pm and 7 pm UTC in the evening; around midnight (11 pm UTC), the system cannot detect any RSO. It should be noted that the observation conditions varied during the campaign, but the system was capable of recording objects as soon as there was a gap in the clouds, which was the case on December 16th and 18th in the morning or in the afternoon of December 13th, 19th and 22nd. Figures 13 and 14 show more details of the measured weather conditions and the corresponding detection rates for each day in a separate graph. In each graph, there are 4 separate rows showing the weather conditions: dry (violet)/rain (black), darkness (magenta)/light (black), clear sky (orange)/cloudy (black) and whether the observation was started (yellow). For example, the time at which the observation was started is shown as a yellow bar, and the resulting detection rate is shown on the right y-axis in blue depending on the time of that day. The presented data cover only 20 days in this article due to space constraints.

The system performed very reliably in judging the weather conditions, and observations were started automatically. When weather conditions were good during the terminator phase, the detection rate went up to 36 objects per hour for the 14th of December but mostly ranged between 25 and 30 objects per hour. The only unexpected behavior we observed was an incorrect reading of the sky-to-ground temperature difference when the weather sensor became wet. This can be seen as fine (orange) lines in the clear sky condition in Figs. 13 and 14, which caused the system to falsely start an observation for a few minutes. Remarkably, the image processing handled those situations without a single false-positive detection. To get more details on the system performance, detected objects need to be identified. These results are presented in the following Sect. 5.2.

5.2 Size of detected objects

All detected objects were compared with CelesTrak's TLE [4] and SATCAT (satellite catalog) [22] catalogs. The TLE catalog is operated and maintained by NORAD (North American Aerospace Defense Command) and contains publicly available predictions of RSOs. The SATCAT catalog contains supplementary information, like the RCS (radar cross section) or the launch date of those RSO. All 823 detected RSOs are compared with every object in the TLE catalog to identify them, and the SATCAT is used to obtain the RCS if available. The SATCAT does not provide information for all 704 identified TLE objects, which reduces the amount of data to be analyzed to 680. The detection statistics grouped by RCS are shown in Fig. 15 as a histogram and in relation to the (predicted) object distance during observation in Fig. 16.

Fig. 12 Hourly detection rate of all observations between November 20th and December 23rd 2020, see Table 4 for system settings. Days without any detections are excluded

Figure 15 shows that objects with an RCS in the range between 0.1 and 20 m² were detected. The number of detected objects peaks around 1 and 2.2 m². As the total number of RSO increases with smaller object size, this indicates that the number of undetected objects rises for objects with an RCS smaller than 2 m². In Fig. 16, it can be seen that there is no direct correlation between the RSO's RCS or range and the radial angular displacement of the measurements from the predicted positions. The residual deviation of the measured angles of a single object also

Fig. 13 Measured weather conditions and the corresponding detection rate of the APPARILLO system between December 3rd and 12th. Each of the 10 days is shown in a separate row. Within each, the conditions Dry (purple), Darkness (magenta), Clear sky (orange) and Observation started (yellow) are shown in four separate rows over the UTC time of the day. If a condition is not satisfied, it is shown in black. On the right axis, the hourly detection rate is shown in blue over the time of the day, see Table 4 for system settings

Fig. 14 Measured weather conditions and the corresponding detection rate of the APPARILLO system between December 13th and 23rd. Each of the 11 days is shown in a separate row. Within each, the conditions Dry (purple), Darkness (magenta), Clear sky (orange) and Observation started (yellow) are shown in four separate rows over the UTC time of the day. If a condition is not satisfied, it is shown in black. On the right axis, the hourly detection rate is shown in blue over the time of the day, see Table 4 for system settings

Fig. 15 Histogram of detected and identified RSOs grouped by their RCS according to SATCAT [22]. Data taken between November 28th and December 23rd, see Table 4 for system settings

Fig. 16 RCS according to SATCAT [22] and range according to TLE predictions [4] of each detected and identified object, shown as a circle. The color of each circle corresponds to the mean radial angular deviation of the angular measurements from the predicted positions (according to TLE prediction) of this detected object, ranging from zero (purple) to 0.05° (yellow). See Table 4 for system settings

does not correlate with RCS or range (angular velocity). In Sect. 5.3, the angular measurements are analyzed in more detail.

Furthermore, the detected objects can be compared to the cataloged NORAD objects which crossed the FOV during observation. This is also done using the TLE orbital data and allows the detection efficiency with respect to all cataloged objects to be calculated. The detection efficiency is a figure of merit for how well the system can detect a certain object size. Figure 17 shows a histogram of detected objects in red and crossing objects in blue according to TLE predictions [4] for different RCS [22]. The bottom half shows the resulting detection efficiency.

Figure 17 shows that the detection efficiency for objects with an RCS larger than 1 m² is about 50%. Even though objects with an RCS as small as 0.1 m² were detected, the detection efficiency there is effectively 0% due to the large population of smaller RSO.

ESA's DISCOS catalog [23] allows physical properties and the optical cross section (OCS) of RSOs to be obtained. The catalog does not provide data for every identified object, but for 410 RSO the data were available. This allowed evaluating the system's detectability with respect to RSO dimensions, OCS and mass. In the following, all objects were sorted in ascending order of their volume. Figure 18 shows the volume and mass, Fig. 19 the dimensions (length, height, depth), Fig. 20 the range, apogee and perigee, and Fig. 21 compares the RCS to the OCS for each object. Additionally, a histogram shows the distribution of each dataset on the y-axis of the graph.

The detected RSOs range between 40 kg and 4 t in mass and between 0.045 and 550 m³ in volume. The dimensions of each object are shown in Fig. 19. Except for the smallest object (Transit 12 (NNS O-8), NORAD 2119, COSPAR: 1966-024A), the dimensions are in the same order of magnitude. As a general rule, the system mostly detects objects the size of a dishwasher and larger.

The majority of detected objects measure a few meters in size. The satellite with the smallest volume measures 30 m in depth. Hence, its OCS is larger than that of the next larger objects, which measure about 1 m in each dimension.

Figure 20 shows the apogee, perigee and the range calculated from the TLE predictions of each identified object, to show whether the detectability correlates with the orbit apogee and perigee.

It can be seen that the detected size does not depend on the range to the system. The objects are detected at ranges around 1000 km, with a few objects as low as 550 km and up to 4500 km. With a few exceptions, the orbits are mostly circular.

The DISCOS catalog also provides a range of OCS values, namely minimum OCS, average OCS and maximum OCS. This represents how the OCS varies as many factors have

Fig. 17 Top: Histogram of RSO detected by APPARILLO (red) and crossing TLE objects (blue) according to CelesTrak NORAD TLE predictions, grouped by RCS. The crossing RSOs are calculated only for times when the system performed observations; if no observation was started, crossing TLE objects are not considered. Bottom: the corresponding detection efficiency (black). The objects were recorded between November 20th and December 23rd, see Table 4 for system settings

an impact on the RSO brightness, like phase angle, material, shape, and size. The following Fig. 21 compares the RCS with these three OCS values for each identified object.

The data show that, in general, the OCS, RCS, and dimensions are correlated, with the OCS being more often larger than the RCS. The RCS and OCS are also similar to the actual dimensions (volume) of each RSO, compare

Fig. 18 For every object, sorted ascending by its volume, the volume (blue) and mass (purple) are shown according to DISCOS [23]. The corresponding mass distribution is shown on the right y-axis and the volume distribution on the left y-axis. Data are taken between November 20th and December 23rd, see Table 4 for system settings

Fig. 19 The dimensions of every identified object, sorted ascending by its volume. The RSO length (green), height (cyan) and depth (orange) are shown on the left, and the corresponding distributions of detected objects are shown for each dimension in the same colors on the right y-axis. Data are taken between November 20th and December 23rd, see Table 4 for system settings

Fig. 21 with Figs. 18 and 19. Thus, the RCS statistics shown in Fig. 17 are a good way of illustrating the system performance and allow the dimensions of detected objects to be estimated.

Next, Sect. 5.3 evaluates the angular measurements of the detected and identified objects in more detail.

Fig. 20 The apogee (green), perigee (red) and range (blue) of every object, sorted ascending by their volume. On the right, the corresponding distribution is shown using the same colors. Data are taken between November 20th and December 23rd, see Table 4 for system settings

Fig. 21 Comparison between RCS (red) obtained from the SATCAT [22] and OCS (green) obtained from the DISCOS catalog [23]. The minimum (dark green) and maximum OCS (light green) are shown as an error bar. On the right y-axis, the corresponding distribution of each dataset is shown using the same colors. Data are taken between November 20th and December 23rd, see Table 4 for system settings

Fig. 22 Equatorial positions of detected RSO by the staring system (red cross) and the corresponding TLE object (blue circle). The size of the markers shows the relative angular displacement between measurement and prediction. Data taken on December 10th, see Table 4 for settings

Fig. 23 Equatorial positions of detected RSO by the staring system (red cross) and the corresponding TLE object (blue circle). The size of the markers shows the relative angular displacement between measurement and prediction. Data taken on December 13th, see Table 4 for settings

5.3 Angular measurements

The available TLE prediction data are also used to compare the recorded angular positions with predictions. Two sample visualizations from a random subset of detected RSO are shown in Figs. 22 and 23. They each show the measured equatorial coordinates of about two dozen RSO and their corresponding predicted positions calculated using TLE data [4]

Fig. 24 In-track angular displacement between measured object positions and their TLE predictions as a distribution for each object. The color corresponds to the mean angular distance of each object to the TLE prediction (not just in-track), from small values colored in blue to large values colored in red

Fig. 25 Cross-track angular displacement between measured object positions and their TLE predictions as a distribution for each object. The color corresponds to the mean angular displacement of each object to the TLE prediction (not just cross-track), from small values colored in blue to large values colored in red

from that day. The selected range covers the entire RA-Dec range observed that day.

The displacement between measurements and predictions shows a constant offset for every object measured. The amount and direction of the angular offset vary between objects. To analyze the angular displacements, the angular displacement is separated into an in-track and a cross-track component. Figures 24 and 25 show a subset of the in-track and cross-track angular displacement distributions between measurement and TLE prediction, respectively, for single objects.
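For completeness, the sketch below shows one possible way to split measurement-minus-prediction offsets into in-track and cross-track components using the predicted direction of motion on the tangent plane. This decomposition is our own construction for illustration; the paper does not describe which convention its analysis scripts use.

```python
import numpy as np

def in_cross_track(ra_meas, dec_meas, ra_pred, dec_pred):
    """Split measurement-minus-prediction offsets (deg) into components along
    (in-track) and perpendicular to (cross-track) the predicted motion."""
    ra_meas, dec_meas, ra_pred, dec_pred = map(np.asarray, (ra_meas, dec_meas, ra_pred, dec_pred))
    # Offsets on the tangent plane; RA differences are compressed by cos(Dec).
    d_ra = (ra_meas - ra_pred) * np.cos(np.radians(dec_pred))
    d_dec = dec_meas - dec_pred
    # Unit vector of the predicted motion from consecutive predicted positions.
    v = np.array([np.gradient(ra_pred) * np.cos(np.radians(dec_pred)), np.gradient(dec_pred)])
    v /= np.linalg.norm(v, axis=0)
    in_track = d_ra * v[0] + d_dec * v[1]       # projection onto the motion
    cross_track = -d_ra * v[1] + d_dec * v[0]   # perpendicular component
    return in_track, cross_track
```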

It can be seen that each single distribution is offset, and the offset is mostly larger than the deviation of the angular measurements of a single object. In general, the displacement offsets from all objects scatter around zero. For most measurements, the displacement is evenly distributed, but other shapes are visible. Both the in-track and cross-track errors (offset and deviation) are in general equally large for each object. The median angular deviation from all measured objects is just 0.0041°, but the angular deviation varied from −0.07° up to 0.05°.

Fig. 26 Histogram of angular in- and cross-track displacement between measured object positions and their TLE predictions for all measured and correlated RSO between 20th of November and 23rd of December 2020

Fig. 27 Time delay (orange) of the light received from an RSO and corresponding angular error (blue) depending on its altitude above ground. Circular orbits are assumed to calculate the time delay and angular error

The constant offset observed between measurements and TLE predictions ranged up to about 0.1°.

Figure 26 shows the histograms of both the in-track and cross-track angular displacement for all measurements taken by APPARILLO during the December campaign. It can be seen that both displacements scatter around 0, which indicates that there is no systematic error causing the uncertainties. The expected residuals from the TLE predictions are larger in-track than cross-track (0.049° vs. 0.013°) [21]. However, the observed displacements show very similar distributions, which is why the observed angular cross-track displacement is attributed to a measurement error of the system.

Due to the travel time of the light and the velocity of the RSO, an angular aberration is present. By the time to which the exposure is synchronized, the object has already moved; Fig. 27 shows the time delay and the corresponding angular error for our system depending on the orbit height (assuming a circular orbit). For the observed objects, the angular aberration caused by the RSO's velocity ranges from 0.0012° to 0.0015° and is not causing the observed displacement offsets of the distributions. Additionally, no systematic error radial from the image center could be observed, which shows that lens distortion is not causing the angular error.
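As a worked example of this effect, the short snippet below reproduces the order of magnitude quoted above for an RSO observed at zenith on a circular orbit. It is our own first-order arithmetic, not the script behind Fig. 27.

```python
C = 299_792_458.0  # speed of light [m/s]

def light_time_aberration(altitude_km: float, omega_deg_s: float = 0.5):
    """Light travel delay and the resulting along-track angular error for an
    RSO observed at zenith on a circular orbit (first-order estimate)."""
    delay_s = altitude_km * 1e3 / C
    return delay_s, omega_deg_s * delay_s  # [s], [deg]

delay, err = light_time_aberration(1000.0)
print(f"delay ~ {delay*1e3:.1f} ms, angular error ~ {err:.4f} deg")
# ~3.3 ms and ~0.0017 deg at 1000 km, the same order as the
# 0.0012-0.0015 deg quoted above for the observed objects
```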

Due to the large uncertainties of the TLE predictions, which range from 0.01° for small inclinations in LEO up to 0.05° for large inclinations in LEO [21], these results do not give the resulting angular in-track and cross-track precision. The mean angular standard deviation of the angular measurements of an object was 0.0041°, which shows that streak positions are measured with sub-pixel accuracy. Unfortunately, the mean angular displacement to the prediction is rather large, at about 0.05°, and can be considered a first approximation of the angular precision. This is about 7 × larger than the pixel scale of the system (0.007°).

More precise orbit predictions are required to determine the absolute precision of the system. Unfortunately, only two of the detected objects had more precise CPF [24] predictions available. For those two, the in-track and cross-track angular displacement distributions are shown in Fig. 28.

Both objects show a similar mean angular displacement as observed previously (Figs. 24, 25), which is larger than the deviation of the distribution, with 0.02° (Jason-3, NORAD: 41240, COSPAR: 2016-002A) and 0.05° (SARAL, NORAD: 39086, COSPAR: 2013-009A) between measured and predicted positions. The standard deviation of the angular measurement is 0.003° for Jason-3 and 0.006° for SARAL. For the majority of objects, this was also observed in the previous analysis using TLE predictions (Figs. 24, 25). These deviations are up to 7 × the system's pixel scale of 0.007°/px (see Table 4), and we can conclude that the estimated system precision is about 0.05° or 7 × the pixel scale. However, these are far too few measurements to conclude the final precision of the staring system.

6 Conclusion

A fully functional staring system was presented, which reliably operated between November 20 and December 23, 2020, even under harsh winter weather conditions. The system operated 24/7 during this observation campaign, and no false-positive detections occurred, which was the major design target of the newly developed image processing. Under optimal observation conditions, the system detected 823 LEO objects within this period with a detection

Fig. 28 In-track (blue) and cross-track (red) angular displacement distribution between measured object positions and the predicted positions using CPF prediction files [24]. The first object is Jason-3 (NORAD = 41240), detected on December 18th; the second is SARAL (NORAD = 39086), detected on November 28th

Multiple detections within one frame can be handled without any constraints. About 15% of the observed objects could not be identified using the TLE database [4]. This demonstrates that the presented system is capable of detecting unknown objects and can effectively support subsequent handover to existing databases or processing pipelines via the TDM format. The detection performance was evaluated using the available SATCAT RCS and DISCOS data of the identified objects. Figures 15 and 16 show that objects smaller than 1 m² RCS can already be detected, which is a very good performance for such a small system. However, the fraction of detections smaller than 0.8 m² RCS remains below 10% (see Fig. 17). The smallest detected objects were as small as 0.2 m² in OCS, 0.25 m² in RCS (see Fig. 21), or 0.045 m³ in volume (see Fig. 18). This is about 2 × larger than the theoretical predictions (see Table 1).

The mean angular standard deviation of the measurements of an object was 0.0041°, with a mean angular distance of about 0.05° to the predicted positions. Comparison with precise CPF predictions confirmed these numbers and confirmed that the precision of the system can be estimated as 0.05° or 7 × the pixel scale.

The angular data quality will further improve with better time synchronization of the image exposures and a correction for the velocity aberration. Improvements in the image evaluation algorithms, for example more sophisticated streak detection as presented by Vananti et al. [25], should suppress the majority of the false-negative detections and bring the detection efficiency close to unity. A different weather station is recommended to eliminate false clear-sky identifications and the resulting short image recordings during cloudy conditions.

The modular COTS approach allows simple reproduction of this sensor, which is easy to operate within any space surveillance network. It also allows the system to be easily adapted to future or customer needs and extended with the latest developments in processing techniques. Synthetic tracking is worth mentioning here, as it has already shown promising results [26] and approaches real-time processing with the latest consumer GPUs; it would allow an order of magnitude higher sensitivity using the same optical components. This system forms the basis for future developments to extend space surveillance networks with a small, low-cost sensor. Stare and chase with subsequent laser ranging allows immediate cataloging and has already been demonstrated [27].
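
The principle behind synthetic tracking can be summarized briefly: many short exposures are shifted along a trial image-plane velocity and co-added, so that a faint moving object builds up signal while the background noise averages down. The numpy sketch below illustrates only this shift-and-add step and is not the GPU implementation referenced in [26].

```python
import numpy as np

def shift_and_add(frames, vx, vy, dt):
    """Co-add a stack of short exposures along a trial image-plane velocity.

    frames: array of shape (n_frames, ny, nx)
    vx, vy: trial velocity in pixels per second
    dt:     time between consecutive frames in seconds
    A real object moving at (vx, vy) adds up coherently, the noise does not.
    Note: np.roll wraps around the image edges; a full implementation would crop.
    """
    stack = np.zeros_like(frames[0], dtype=float)
    for i, frame in enumerate(frames):
        dx = int(round(-vx * i * dt))   # shift back along the assumed motion
        dy = int(round(-vy * i * dt))
        stack += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return stack / len(frames)

# In practice, a grid of trial velocities is searched and the stack with the
# highest peak signal-to-noise ratio reveals the object and its angular rate.
```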

Acknowledgements The concept of passive optical staring in combination with subsequent laser ranging was protected under the utility patent DE 20 2017 101 831 U1 (System zur Bestimmung und/oder Vorhersage einer Position und/oder einer Flugbahn von orbitalen Objekten im Weltraum; a utility model for determining and/or predicting the position and/or trajectory of orbital objects in space) on 29.03.2017 [8]. Many thanks to Thomas Hasenohr [28] and Carina Kießling [29], who helped to develop the system during their theses. All analysis was done using Python scripts, and the graphs shown were plotted using the matplotlib library.

Funding Open Access funding enabled and organized by Projekt DEAL.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References

1. Krag, H., Setty, S.J., Di Mira, A., Zayer, I., Flohrer, T.: Ground-based laser for tracking and remediation – an architectural view. IAC-18-A6.7.1. Proceedings of the 69th international astronautical congress (IAC) (2018)

2. Bamann, C., Hugentobler, U.: Accurate orbit determination of space debris with laser tracking. Proc. 7th European conference on space debris, Flohrer, T., Schmitz, F. (eds.) (2017)

3. Wagner, P., Hampf, D., Riede, W.: Passive optical space surveillance system for initial LEO object detection. In: Proceedings of 66th international astronautical conference, IAC-15.A6.3x28491 (2015)

4. CelesTrak: NORAD Two-Line Element Sets. www.celestrak.com/NORAD/elements/ (2020). Accessed 5 Dec 2020

5. Hasenohr, T., Hampf, D., Wagner, P., et al.: Initial detection of low earth orbit objects through passive optical wide angle imaging systems. Proc. of Deutscher Luft- und Raumfahrtkongress. https://www.dglr.de/publikationen/2016/420008.pdf (2016). Accessed 25 June 2021

6. CCSDS: Tracking Data Message—RECOMMENDED STANDARD CCSDS 503.0-P-1.1. https://public.ccsds.org/Lists/CCSDS%205030P11/503x0p11.pdf (2018). Accessed 21 December 2020

7. Wagner, P., Hasenohr, T., Hampf, D., Sproll, F., Humbert, L., Rodmann, J., Riede, W.: Detection and laser ranging of orbital objects using optical methods. Proceedings Volume 9977, Remote Sensing System Engineering VI; 99770D (2016)

8. Wagner, P., Hampf, D., Sproll, F., Humbert, L., Hasenohr, T.: System zur Bestimmung und/oder Vorhersage einer Position und/oder einer Flugbahn von orbitalen Objekten im Weltraum. Gebrauchsmusterschrift, DE 20 2017 101 831 U1. https://patents.google.com/patent/DE202017101831U1/en (https://register.dpma.de/DPMAregister/pat/PatSchrifteneinsicht?docId=DE202017101831U1) (2017). Accessed 22 Jan 2021

9. Wagner, P., Hampf, D., Sproll, F., Hasenohr, T., Humbert, L., Rodmann, J., Riede, W.: Passive optical link budget for LEO space surveillance. Technical proceedings of 18th AMOS conference. https://amostech.com/TechnicalPapers/2017/Poster/Wagner.pdf (2017). Accessed 26 Jan 2021

10. Coder, R.D., Holzinger, M.J.: Multi-objective design of optical systems for space situational awareness. Acta Astronaut. 128, 669–684 (2016)

11. Finger Lakes Instruments: ProLine 09000 data sheet. http://www.flicamera.com/spec_sheets/PL09000.pdf (2018). Accessed 2 Aug 2018

12. Vincent Associates: Uniblitz CS65 datasheet. https://www.uniblitz.com/wp-content/uploads/2020/05/uniblitz-cs65-datasheet.pdf (2020). Accessed 5 Nov 2020

13. Preisvergleich Internet Services AG: Canon EF 200mm 2.0 L IS USM weiß (2297B005). https://geizhals.de/canon-ef-200mm-2-0-l-is-usm-weiss-2297b005-a292286.html (2020). Accessed 05 Dec 2020

14. Gross, H.: Handbook of optical systems: vol 1—fundamentals of technical optics. Wiley, Weinheim (2005)

15. Hampf, D.: GPS camera timer. (15 Dec 2016)

16. Hampf, D., Wagner, P., Riede, W.: Optical technologies for observation of low earth orbit objects. Proceedings of 65th international astronautical conference, paper IAC-14,A6,P,70,x24899 (2014)

17. Hampf, D., Sproll, F., Hasenohr, T.: OOOS: a hardware-independent SLR control system. In ILRS technical workshop, Riga (2017)

18. Haussmann, R., Wagner, P., Clausen, T.: Streak detection of space debris by a passive optical sensor. Proc. of 8th European conference on space debris (2021)

19. OpenCV team: OpenCV Website. https://opencv.org/ (2020). Accessed 24 Dec 2020

20. Astrometry.net project: Astrometry.net astrometric engine. http://astrometry.net/use.html (2020). Accessed 08 Feb 2020

21. Flohrer, T., Krag, H., Klinkrad, H.: Assessment and categorization of TLE orbit errors for the US SSN catalogue. Technical proceedings of 9th AMOS conference (2008)

22. CelesTrak: SATCAT online satellite database. https://celestrak.com/satcat/boxscore.php (2021). Accessed 23 Jan 2021

23. ESA: DISCOSweb. https://discosweb.esoc.esa.int/ (2021). Accessed 29 Jan 2021

24. TU München: Satellites tracked by ILRS. https://edc.dgfi.tum.de/en/satellites/ (2020). Accessed 28 Jan 2021

25. Vananti, A., Schild, K., Schildknecht, T.: Improved detection of faint streaks based on a streak-like spatial filter. Adv. Space Res. 65(1), 364–378 (2020)

26. Zscheile, J., Wagner, P., Lorbeer, R.-A., Guthier, B.: Synthetic tracking for orbital object detection in LEO. Proc. of 1st NEO and Debris Detection Conference. https://conference.sdo.esoc.esa.int/proceedings/neosst1/paper/399/NEOSST1-paper399.pdf (2019). Accessed 9 Feb 2021

27. Steindorfer, M., Kirchner, G., Koidl, F., Wang, P., Anto, A., Sánchez, J.F., Merz, K.: Stare and chase of space debris targets using real-time derived pointing data. Adv. Space Res. 60, 1201–1209 (2017)

28. Hasenohr, T.: Initial detection and tracking of objects in low earth orbit. University of Stuttgart, Master thesis (2016)

29. Kießling, C.: Development of digital imaging methods for an autonomous system for detection and mapping of orbital objects. Johannes Gutenberg-University Mainz, Master thesis (2018)

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.