<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing with OASIS Tables v3.0 20080202//EN" "https://jats.nlm.nih.gov/nlm-dtd/publishing/3.0/journalpub-oasis3.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:oasis="http://docs.oasis-open.org/ns/oasis-exchange/table" xml:lang="en" dtd-version="3.0" article-type="research-article">
  <front>
    <journal-meta><journal-id journal-id-type="publisher">JSSS</journal-id><journal-title-group>
    <journal-title>Journal of Sensors and Sensor Systems</journal-title>
    <abbrev-journal-title abbrev-type="publisher">JSSS</abbrev-journal-title><abbrev-journal-title abbrev-type="nlm-ta">J. Sens. Sens. Syst.</abbrev-journal-title>
  </journal-title-group><issn pub-type="epub">2194-878X</issn><publisher>
    <publisher-name>Copernicus Publications</publisher-name>
    <publisher-loc>Göttingen, Germany</publisher-loc>
  </publisher></journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.5194/jsss-15-27-2026</article-id><title-group><article-title>Recognising wild animals on roads: multisensor systems for accident avoidance</article-title><alt-title>Recognising wild animals on roads</alt-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author" corresp="yes" rid="aff1">
          <name><surname>Schneider</surname><given-names>Michael</given-names></name>
          <email>michael.schneider@thu.de</email>
        <ext-link ext-link-type="uri" xlink:href="https://orcid.org/0009-0006-7624-5436">https://orcid.org/0009-0006-7624-5436</ext-link></contrib>
        <contrib contrib-type="author" corresp="no" rid="aff1">
          <name><surname>Mantz</surname><given-names>Hubert</given-names></name>
          
        </contrib>
        <contrib contrib-type="author" corresp="no" rid="aff1">
          <name><surname>Walter</surname><given-names>Thomas</given-names></name>
          
        </contrib>
        <contrib contrib-type="author" corresp="no" rid="aff2">
          <name><surname>Montoya-Capote</surname><given-names>Mike</given-names></name>
          
        </contrib>
        <contrib contrib-type="author" corresp="no" rid="aff2">
          <name><surname>Berger</surname><given-names>Jonas</given-names></name>
          
        </contrib>
        <contrib contrib-type="author" corresp="no" rid="aff2">
          <name><surname>Reichel</surname><given-names>Andreas</given-names></name>
          
        </contrib>
        <contrib contrib-type="author" corresp="no" rid="aff2">
          <name><surname>Hollmach</surname><given-names>Nils</given-names></name>
          
        </contrib>
        <aff id="aff1"><label>1</label><institution>University of Applied Sciences Ulm, 89081 Ulm, Germany</institution>
        </aff>
        <aff id="aff2"><label>2</label><institution>tecVenture, 04177 Leipzig, Germany</institution>
        </aff>
      </contrib-group>
      <author-notes><corresp id="corr1">Michael Schneider (michael.schneider@thu.de)</corresp></author-notes><pub-date><day>19</day><month>February</month><year>2026</year></pub-date>
      
      <volume>15</volume>
      <issue>1</issue>
      <fpage>27</fpage><lpage>33</lpage>
      <history>
        <date date-type="received"><day>25</day><month>September</month><year>2025</year></date>
           <date date-type="rev-recd"><day>28</day><month>January</month><year>2026</year></date>
           <date date-type="accepted"><day>29</day><month>January</month><year>2026</year></date>
      </history>
      <permissions>
        <copyright-statement>Copyright: © 2026 Michael Schneider et al.</copyright-statement>
        <copyright-year>2026</copyright-year>
      <license license-type="open-access"><license-p>This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this licence, visit <ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">https://creativecommons.org/licenses/by/4.0/</ext-link></license-p></license></permissions><self-uri xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026.html">This article is available from https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026.html</self-uri><self-uri xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026.pdf">The full text article is available as a PDF file from https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026.pdf</self-uri>
      <abstract><title>Abstract</title>

      <p id="d2e138">Wildlife-related traffic accidents represent a persistent hazard on rural roads in Germany and beyond. Current electronic wildlife warning systems typically monitor only very short distances and therefore cannot provide large-area coverage. This paper presents a novel multisensor approach that integrates radar and infrared (IR) technology into existing roadside delineators. Due to regulatory requirements, delineators are placed at intervals of <inline-formula><mml:math id="M1" display="inline"><mml:mn mathvariant="normal">50</mml:mn></mml:math></inline-formula> m on German country roads. Integrating sensors into these delineators thus provides a uniform infrastructure that can be utilised. The radial extension of the sensor range allows a monitoring zone to be formed along the road. We evaluate thermal infrared arrays and high-resolution <inline-formula><mml:math id="M2" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensors for range, resolution and robustness under varying environmental conditions. Field measurements in wildlife parks demonstrate that the system can reliably detect deer at distances of up to <inline-formula><mml:math id="M3" display="inline"><mml:mn mathvariant="normal">30</mml:mn></mml:math></inline-formula> m and evaluate their moving speed as well. Challenges such as ambient temperature effects, optical dispersion in IR detection and resolution limits are discussed. The results highlight the potential of multisensor systems to reduce wildlife accidents and improve road safety.</p>
  </abstract>
    
<funding-group>
<award-group id="gs1">
<funding-source>Bundesministerium für Verkehr, Bau und Stadtentwicklung</funding-source>
<award-id>mFUND</award-id>
</award-group>
</funding-group>
</article-meta>
  </front>
<body>
      

<sec id="Ch1.S1" sec-type="intro">
  <label>1</label><title>Introduction</title>
      <p id="d2e171">An analysis of wildlife accident statistics from the German Hunting Association demonstrates a steady trend of increasing wildlife accidents <xref ref-type="bibr" rid="bib1.bibx7" id="paren.1"/>. In order to assist drivers in avoiding accidents, car developers are turning to driver assistance systems  <xref ref-type="bibr" rid="bib1.bibx12 bib1.bibx2" id="paren.2"/>. Several pilot projects demonstrate how drivers can be actively warned: an electronic wildlife warning system has been tested and developed by the Forestry Testing and Research Institute in Baden-Württemberg, Germany, since 2007. This system involves the installation of a fence alongside the road to guide wild animals to a crossing area. At this point, infrared sensors are programmed to detect the presence of an animal, leading to the triggering of a warning signal for the driver <xref ref-type="bibr" rid="bib1.bibx4" id="paren.3"/>. All existing sensor systems have one key feature in common: they monitor only short sections of 5–7 m, leaving the rest of the road unmonitored. These systems therefore require additional protection for monitoring along the road. As described above, fences for instance are used for this purpose.</p>
      <p id="d2e183">To overcome this limitation, we propose a sensor system that leverages both radar and infrared technologies. Installed on existing roadside delineators spaced <inline-formula><mml:math id="M4" display="inline"><mml:mn mathvariant="normal">50</mml:mn></mml:math></inline-formula> m apart, the sensors monitor a zone extending up to <inline-formula><mml:math id="M5" display="inline"><mml:mn mathvariant="normal">50</mml:mn></mml:math></inline-formula> m parallel to the road. Reliable detection requires the recognition and segmentation of wild animals, whether in radar range-Doppler maps or thermal infrared images. The implementation of a higher resolution than that employed for presence detection is essential for the successful integration of an infrared sensor into a detection and warning segmentation system. Consequently, low-cost IR arrays are a possible option to identify deer using imaging techniques. The combination of modalities enables more robust detection and provides essential data for driver warnings.</p>
</sec>
<sec id="Ch1.S2">
  <label>2</label><title>Sensor system</title>
      <p id="d2e208">It seems appropriate to use sensors that cover the area between the delineators that look outwards and parallel to the road. Geometric considerations for covering a secure area show that sensors attached to a delineator must cover a <inline-formula><mml:math id="M6" display="inline"><mml:mn mathvariant="normal">30</mml:mn></mml:math></inline-formula> m or greater radius in the form of a half circle. Figure <xref ref-type="fig" rid="F1"/> illustrates this. The radial overlap of <inline-formula><mml:math id="M7" display="inline"><mml:mn mathvariant="normal">10</mml:mn></mml:math></inline-formula> m results in an intersection point between two adjacent detection radii. The distance between the road and the intersection point is <inline-formula><mml:math id="M8" display="inline"><mml:mn mathvariant="normal">16.6</mml:mn></mml:math></inline-formula> m, which can be considered sufficiently large.</p>

      <fig id="F1"><label>Figure 1</label><caption><p id="d2e236">The schematic represents a road with delineators. The distance between each delineator is <inline-formula><mml:math id="M9" display="inline"><mml:mn mathvariant="normal">50</mml:mn></mml:math></inline-formula> m, and the sensing radius of the sensors forms a <inline-formula><mml:math id="M10" display="inline"><mml:mn mathvariant="normal">30</mml:mn></mml:math></inline-formula> m half-circle radius around a delineator <xref ref-type="bibr" rid="bib1.bibx14" id="paren.4"/> (<uri>https://creativecommons.org/licenses/by/4.0/</uri>, last access: 11 February 2026).</p></caption>
        <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f01.png"/>

      </fig>

      <p id="d2e265">These requirements were used to select the sensors. These include infrared detectors, which were previously used in wildlife warning systems, as well as thermal sensors in general <xref ref-type="bibr" rid="bib1.bibx4 bib1.bibx1" id="paren.5"/>.  However, directional detection or imaging is required here to recognise the direction in which the animal is moving.</p>
      <p id="d2e272">A new approach that has already been investigated in <xref ref-type="bibr" rid="bib1.bibx8 bib1.bibx13" id="text.6"/> uses radar sensors to detect micro-Doppler signatures for different living persons and animals. <xref ref-type="bibr" rid="bib1.bibx8" id="author.7"/> use a <inline-formula><mml:math id="M11" display="inline"><mml:mn mathvariant="normal">77</mml:mn></mml:math></inline-formula> GHz radar sensor with a bandwidth of <inline-formula><mml:math id="M12" display="inline"><mml:mn mathvariant="normal">3</mml:mn></mml:math></inline-formula> GHz to detect the signatures. The signature of each body part is totally different for each of these subjects. The high bandwidth should make it easy to detect the animal's direction of movement and to identify individual body parts. Based on the results of previous research, a  <inline-formula><mml:math id="M13" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensor was identified that offers similar resolution but can evaluate larger distances. A mobile measuring system (Fig. <xref ref-type="fig" rid="F2"/>a), that can be easily positioned, was developed for evaluating and testing the limits.</p>

      <fig id="F2" specific-use="star"><label>Figure 2</label><caption><p id="d2e307">Sensor system at a measuring location with animal-proof housing and solar modules in <bold>(a)</bold>. <bold>(b)</bold> shows the sensors.</p></caption>
        <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f02.png"/>

      </fig>

      <p id="d2e322">The sensors are shown in Fig. <xref ref-type="fig" rid="F2"/>b. The RFBeam <inline-formula><mml:math id="M14" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz sensor is located at the top-right corner. To its left, behind special protective glass, is the Boson thermal camera. The evaluation board and its IR array are located at the bottom right.</p>
      <p id="d2e334">On the left are two <inline-formula><mml:math id="M15" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensors from our project partner, tecVenture. These are aligned at a 20° angle to each other. Above these is a <inline-formula><mml:math id="M16" display="inline"><mml:mn mathvariant="normal">24</mml:mn></mml:math></inline-formula> GHz sensor, also developed by tecVenture, that was neither used nor evaluated.</p>
<sec id="Ch1.S2.SS1">
  <label>2.1</label><title>Thermal sensors</title>
      <p id="d2e359">To evaluate thermal imaging, we used a high-resolution thermal camera from Boson <xref ref-type="bibr" rid="bib1.bibx3" id="paren.8"/>, with a 50° field of view (FoV) lens as a reference for validation and verification. As we do not know any specific emission values for nature in relation to solar radiation, we perform an automatic adjustment using histogram optimisation before each measurement interval. This allows us to achieve optimal contrast between the background and the deer in every scenario. Because these cameras are expensive, we also used HTPA-IR arrays from a Heimann Sensor <xref ref-type="bibr" rid="bib1.bibx6" id="paren.9"/>. These are available in different resolutions, in our case 16 <inline-formula><mml:math id="M17" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 16 and 60 <inline-formula><mml:math id="M18" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 40 pixels. Since the detection of wild animals with thermal imaging cameras depends heavily on ambient light and temperature, the range was evaluated based on indoor tests, whereby range is understood to be the maximum distance at which a person can still be detected at a constant ambient temperature. The indoor comparison was considered appropriate because wildlife crossings mainly occur at twilight. These indoor tests therefore enable measurements to be taken without the influence of sunlight. Thermal solar radiation therefore has less influence. Standard Heimann sensors without adapted lenses were used for the tests. This means that the viewing ranges of the sensors differ (Table <xref ref-type="table" rid="T1"/>).</p>

<table-wrap id="T1"><label>Table 1</label><caption><p id="d2e387">IR arrays.</p></caption><oasis:table frame="topbot"><oasis:tgroup cols="3">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="left"/>
     <oasis:colspec colnum="3" colname="col3" align="left"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Sensor size</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M19" display="inline"><mml:mrow><mml:mn mathvariant="normal">16</mml:mn><mml:mo>×</mml:mo><mml:mn mathvariant="normal">16</mml:mn></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M20" display="inline"><mml:mrow><mml:mn mathvariant="normal">60</mml:mn><mml:mo>×</mml:mo><mml:mn mathvariant="normal">40</mml:mn></mml:mrow></mml:math></inline-formula></oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Sensitivity</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M21" display="inline"><mml:mrow><mml:mn mathvariant="normal">160</mml:mn><mml:mspace linebreak="nobreak" width="0.125em"/><mml:mi mathvariant="normal">m</mml:mi><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mi mathvariant="normal">K</mml:mi></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M22" display="inline"><mml:mrow><mml:mn mathvariant="normal">90</mml:mn><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mi mathvariant="normal">m</mml:mi><mml:mspace linebreak="nobreak" width="0.125em"/><mml:mi mathvariant="normal">K</mml:mi></mml:mrow></mml:math></inline-formula></oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">FoV</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M23" display="inline"><mml:mn mathvariant="normal">44</mml:mn></mml:math></inline-formula>°</oasis:entry>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M24" display="inline"><mml:mrow><mml:mn mathvariant="normal">92</mml:mn><mml:mo>×</mml:mo><mml:mn mathvariant="normal">59</mml:mn></mml:mrow></mml:math></inline-formula>°</oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup></oasis:table></table-wrap>

      <p id="d2e504">For a first evaluation of detecting, a person was positioned in front of the sensors at the same distance and at different ambient temperatures. For example, as Fig. <xref ref-type="fig" rid="F3"/> shows, the person is easily recognisable at a distance of <inline-formula><mml:math id="M25" display="inline"><mml:mn mathvariant="normal">14.5</mml:mn></mml:math></inline-formula> m in a resolution of 60 <inline-formula><mml:math id="M26" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 40 pixels at an ambient temperature of <inline-formula><mml:math id="M27" display="inline"><mml:mn mathvariant="normal">18</mml:mn></mml:math></inline-formula> °C. However, at a resolution of 16 <inline-formula><mml:math id="M28" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 16 pixels, the person can no longer be identified due to the lower resolution in the same ambient temperature. This can be described by the spot size ratio (<xref ref-type="bibr" rid="bib1.bibx10" id="altparen.10"/>). If this is converted to the pixel area <inline-formula><mml:math id="M29" display="inline"><mml:mi>A</mml:mi></mml:math></inline-formula>, Eq. (<xref ref-type="disp-formula" rid="Ch1.E1"/>) is obtained. Here, <inline-formula><mml:math id="M30" display="inline"><mml:mi>d</mml:mi></mml:math></inline-formula> corresponds to the object distance and<inline-formula><mml:math id="M31" display="inline"><mml:mi>n</mml:mi></mml:math></inline-formula> corresponds to the number of pixels in the azimuth and elevation directions, respectively. However, using this equation to determine the maximum distance between objects in correlation to object size is incorrect. The reason for this is optical dispersion. This phenomenon describes how the thermal radiation, emitted from a small area, does not provide sufficient energy to the individual pixel. For a reliable measurement, the measuring point should therefore cover a resolution of at least 3 <inline-formula><mml:math id="M32" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 3 pixels <xref ref-type="bibr" rid="bib1.bibx19" id="paren.11"/>. According to this theory, the target is already too far away for the sensor in Fig. <xref ref-type="fig" rid="F3"/>a, because if the ambient temperature rises, only one pixel would show the person. Therefore, the further measurements are limited to the IR array with the 60 <inline-formula><mml:math id="M33" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 40 pixels. This sensor is therefore appropriate for the distance mentioned above, up to an ambient temperature of approximately <inline-formula><mml:math id="M34" display="inline"><mml:mn mathvariant="normal">26</mml:mn></mml:math></inline-formula> °C.

            <disp-formula id="Ch1.E1" content-type="numbered"><label>1</label><mml:math id="M35" display="block"><mml:mrow><mml:mi>A</mml:mi><mml:mo>=</mml:mo><mml:msup><mml:mi>d</mml:mi><mml:mn mathvariant="normal">2</mml:mn></mml:msup><mml:mo>⋅</mml:mo><mml:mstyle displaystyle="true"><mml:mfrac style="display"><mml:mrow><mml:msub><mml:mi mathvariant="normal">FoV</mml:mi><mml:mi mathvariant="normal">azimuth</mml:mi></mml:msub><mml:mo>⋅</mml:mo><mml:msub><mml:mi mathvariant="normal">FoV</mml:mi><mml:mi mathvariant="normal">elevation</mml:mi></mml:msub><mml:mo>⋅</mml:mo><mml:msup><mml:mi mathvariant="italic">π</mml:mi><mml:mn mathvariant="normal">2</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msub><mml:mi>n</mml:mi><mml:mi mathvariant="normal">azimuth</mml:mi></mml:msub><mml:mo>⋅</mml:mo><mml:msub><mml:mi>n</mml:mi><mml:mi mathvariant="normal">elevation</mml:mi></mml:msub><mml:mo>⋅</mml:mo><mml:msup><mml:mn mathvariant="normal">180</mml:mn><mml:mn mathvariant="normal">2</mml:mn></mml:msup></mml:mrow></mml:mfrac></mml:mstyle></mml:mrow></mml:math></disp-formula></p>

      <fig id="F3" specific-use="star"><label>Figure 3</label><caption><p id="d2e652">Person located <inline-formula><mml:math id="M36" display="inline"><mml:mn mathvariant="normal">14.5</mml:mn></mml:math></inline-formula> m away in front of a 16 <inline-formula><mml:math id="M37" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 16 array <bold>(a)</bold> and a 60 <inline-formula><mml:math id="M38" display="inline"><mml:mo>×</mml:mo></mml:math></inline-formula> 40 array <bold>(b)</bold>.</p></caption>
          <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f03.png"/>

        </fig>

      <p id="d2e688">Practical measurements show that the presence of deer at a reference ambient temperature of <inline-formula><mml:math id="M39" display="inline"><mml:mn mathvariant="normal">20</mml:mn></mml:math></inline-formula> °C can be recognised both in the IR array and in the high-resolution Boson camera (Fig. <xref ref-type="fig" rid="F4"/>). For our proof of concept, the standard configuration provided by Heimann, as shown in Table <xref ref-type="table" rid="T1"/>, is therefore sufficient. To extend the FoV to <inline-formula><mml:math id="M40" display="inline"><mml:mn mathvariant="normal">90</mml:mn></mml:math></inline-formula>°, either a second sensor or a lens can be installed. However, as our evaluation shows, a larger FoV is linked to temperature resolution, which leads to lower distance resolution under the same environmental conditions. The single deer is <inline-formula><mml:math id="M41" display="inline"><mml:mn mathvariant="normal">18</mml:mn></mml:math></inline-formula> m away and, due to its body orientation, emits a large amount of heat, which the IR array maps over several pixels. By contrast, the deer on the left of the image is slightly farther away and aligned with the sensor system, resulting in a much lower heat signature.</p>

      <fig id="F4" specific-use="star"><label>Figure 4</label><caption><p id="d2e718">Two deer are located approximately <inline-formula><mml:math id="M42" display="inline"><mml:mn mathvariant="normal">18</mml:mn></mml:math></inline-formula> m from the sensor system. <bold>(a)</bold> shows the high-resolution thermal image captured by the Boson camera, while <bold>(b)</bold> shows the image captured by the IR array.</p></caption>
          <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f04.png"/>

        </fig>

      <p id="d2e740">As part of an initial trial, a convolutional neural network (CNN) was evaluated for its capacity to automatically detect deer that pose a threat to traffic safety. Due to the inability to clearly identify detection in radar data, the co-registration approach is to be utilised. Prior to the execution of the aforementioned procedure, it is necessary to first evaluate the thermal images. Given the simplicity and rapid detection rate of the YOLO (You Only Look Once) model, it was selected as the primary approach for the segmentation process. As in <xref ref-type="bibr" rid="bib1.bibx18" id="text.12"/> and <xref ref-type="bibr" rid="bib1.bibx16" id="text.13"/>, the YOLO <xref ref-type="bibr" rid="bib1.bibx17" id="paren.14"/>  CNN was used for automatic referencing. To achieve this, the 16-bit greyscale images from the Boson camera were cropped to 8 bits using mean-value nomination. This is necessary because thermal intensity is expressed as a relative number of 16 bits; thus, the measured relative temperature of each pixel is mapped to this value range. Additionally, YOLO is optimised for evaluating 8-bit RGB images. Due to the CNN's architecture, YOLO always requires three individual images (R, G, and B), which are superimposed to form the overall image. To analyse the impact of converting 16-bit thermal images to 8-bit false colours, YOLO was trained twice with the same images and data distribution (Table <xref ref-type="table" rid="T2"/>). For this evaluation, a random selection of images was chosen, which were manually marked and provided with bounding boxes indicating where and how many animals were present. The selected images were then divided into two classes: with deer and without deer. Three sets of different sizes were formed from these classes (Table <xref ref-type="table" rid="T2"/>). Due to the false-colour representation and the greyscale scaling, there are now six sets, three each for processing in the CNN.</p>

<table-wrap id="T2"><label>Table 2</label><caption><p id="d2e759">Data records for the two YOLO evaluations.</p></caption><oasis:table frame="topbot"><oasis:tgroup cols="2">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="left"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1"/>
         <oasis:entry colname="col2">Data set size</oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row>
         <oasis:entry colname="col1">Train</oasis:entry>
         <oasis:entry colname="col2">1520 images</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">Valid</oasis:entry>
         <oasis:entry colname="col2">867 images</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">Test</oasis:entry>
         <oasis:entry colname="col2">181 images</oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup></oasis:table></table-wrap>

      <p id="d2e813">After determining the range of the mean value and standard deviation, the single-channel greyscale image is used as a three-channel image in the first training case. This means the grey image is composed three times as RGB and transferred to YOLO (Fig. <xref ref-type="fig" rid="F5"/>a). In the second training case, the greyscale image is coloured using a false-colour representation of the “jet” colour map and can be split into a three-channel RGB image (Fig. <xref ref-type="fig" rid="F5"/>b) as a normal RGB image.</p>

      <fig id="F5" specific-use="star"><label>Figure 5</label><caption><p id="d2e822">Both images show a deer after mean-value normalisation: <bold>(a)</bold> in greyscale and <bold>(b)</bold> in a false-colour representation.</p></caption>
          <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f05.png"/>

        </fig>

      <p id="d2e837">The recall is particularly noteworthy when assessing the training cases, as it demonstrates the number of deer that can be detected. The false-colour representation is <inline-formula><mml:math id="M43" display="inline"><mml:mrow><mml:mn mathvariant="normal">84.4</mml:mn><mml:mspace linebreak="nobreak" width="0.125em"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula>, about 3 points higher than the grey image. The precision evaluation shows an even smaller difference, with the false-colour representation at <inline-formula><mml:math id="M44" display="inline"><mml:mrow><mml:mn mathvariant="normal">85.4</mml:mn><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula>, only 2 points higher. Deer are mainly not detected in the recall when they are at a great distance (<inline-formula><mml:math id="M45" display="inline"><mml:mn mathvariant="normal">20</mml:mn></mml:math></inline-formula> m or more) from the measuring system or when a large part of the animal is obscured by vegetation.</p>

<table-wrap id="T3"><label>Table 3</label><caption><p id="d2e874">YOLO results.</p></caption><oasis:table frame="topbot"><oasis:tgroup cols="3">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="right"/>
     <oasis:colspec colnum="3" colname="col3" align="right"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1"/>
         <oasis:entry colname="col2">Precision</oasis:entry>
         <oasis:entry colname="col3">Recall</oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row>
         <oasis:entry colname="col1">Greyscale</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M46" display="inline"><mml:mrow><mml:mn mathvariant="normal">83.2</mml:mn><mml:mspace linebreak="nobreak" width="0.125em"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M47" display="inline"><mml:mrow><mml:mn mathvariant="normal">81.1</mml:mn><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula></oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">False colour</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M48" display="inline"><mml:mrow><mml:mn mathvariant="normal">85.4</mml:mn><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M49" display="inline"><mml:mrow><mml:mn mathvariant="normal">84.4</mml:mn><mml:mspace width="0.125em" linebreak="nobreak"/><mml:mrow class="unit"><mml:mi mathvariant="normal">%</mml:mi></mml:mrow></mml:mrow></mml:math></inline-formula></oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup></oasis:table></table-wrap>

      <p id="d2e969">This signifies that the acquisition of thermal data is especially advantageous in environments with substantial vegetation. In open terrain evaluating images in greyscale is sufficient to detect animals.</p>
</sec>
<sec id="Ch1.S2.SS2">
  <label>2.2</label><title>Radar sensors</title>
      <p id="d2e981">Radar detection depends strongly on carrier frequency, as higher frequencies reduce range but improve resolution. Free-space path loss (Eq. <xref ref-type="disp-formula" rid="Ch1.E2"/>) <xref ref-type="bibr" rid="bib1.bibx5" id="paren.15"/> and range resolution (Eq. <xref ref-type="disp-formula" rid="Ch1.E3"/>) define sensor performance. For the FSPL, the maximum range <inline-formula><mml:math id="M50" display="inline"><mml:mi>r</mml:mi></mml:math></inline-formula> of interest is of primary importance, which depends directly on the carrier frequency <inline-formula><mml:math id="M51" display="inline"><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mi mathvariant="normal">c</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula>, with <inline-formula><mml:math id="M52" display="inline"><mml:mi>c</mml:mi></mml:math></inline-formula> describing the speed of light as the propagation speed in air.

            <disp-formula id="Ch1.E2" content-type="numbered"><label>2</label><mml:math id="M53" display="block"><mml:mrow><mml:mi mathvariant="normal">FSPL</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">20</mml:mn><mml:mi>log⁡</mml:mi><mml:mn mathvariant="normal">10</mml:mn><mml:mo>(</mml:mo><mml:mi>r</mml:mi><mml:mo>⋅</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mi mathvariant="normal">c</mml:mi></mml:msub><mml:mo>⋅</mml:mo><mml:mstyle displaystyle="true"><mml:mfrac style="display"><mml:mrow><mml:mn mathvariant="normal">4</mml:mn><mml:mi mathvariant="italic">π</mml:mi></mml:mrow><mml:mi>c</mml:mi></mml:mfrac></mml:mstyle><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula>

          Because larger bandwidths BW are permitted at higher carrier frequencies, a finer range resolution <inline-formula><mml:math id="M54" display="inline"><mml:mrow><mml:mi mathvariant="normal">Δ</mml:mi><mml:mi>R</mml:mi></mml:mrow></mml:math></inline-formula> can be achieved <xref ref-type="bibr" rid="bib1.bibx9" id="paren.16"/>.

            <disp-formula id="Ch1.E3" content-type="numbered"><label>3</label><mml:math id="M55" display="block"><mml:mrow><mml:mi mathvariant="normal">Δ</mml:mi><mml:mi>R</mml:mi><mml:mo>=</mml:mo><mml:mstyle displaystyle="true"><mml:mfrac style="display"><mml:mi>c</mml:mi><mml:mrow><mml:mn mathvariant="normal">2</mml:mn><mml:mo>⋅</mml:mo><mml:mi mathvariant="normal">BW</mml:mi></mml:mrow></mml:mfrac></mml:mstyle></mml:mrow></mml:math></disp-formula></p>
      <p id="d2e1093">The <inline-formula><mml:math id="M56" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensor from RFBeam with integrated signal processing enables a maximum range of <inline-formula><mml:math id="M57" display="inline"><mml:mrow><mml:mi>r</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">30</mml:mn></mml:mrow></mml:math></inline-formula> m with a range resolution of <inline-formula><mml:math id="M58" display="inline"><mml:mrow><mml:mi mathvariant="normal">Δ</mml:mi><mml:mi>R</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">0.23</mml:mn></mml:mrow></mml:math></inline-formula> m and a velocity resolution of <inline-formula><mml:math id="M59" display="inline"><mml:mrow><mml:mi mathvariant="normal">Δ</mml:mi><mml:mi>v</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">0.43</mml:mn></mml:mrow></mml:math></inline-formula> s (Fig. <xref ref-type="fig" rid="F6"/>) and still has an FoV of azimuth <inline-formula><mml:math id="M60" display="inline"><mml:mrow><mml:mo>=</mml:mo><mml:mi mathvariant="normal">elevation</mml:mi><mml:mo>=</mml:mo><mml:mo>±</mml:mo><mml:mn mathvariant="normal">86</mml:mn></mml:mrow></mml:math></inline-formula>°. This makes it suitable for installation in our sensor system. Using a moving target indication filter <xref ref-type="bibr" rid="bib1.bibx15" id="paren.17"/>, static targets can be filtered without compromising the signal-to-noise ratio. This makes it possible to detect a deer at a distance of up to <inline-formula><mml:math id="M61" display="inline"><mml:mrow><mml:mi>r</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">30</mml:mn></mml:mrow></mml:math></inline-formula> m.</p>

      <fig id="F6" specific-use="star"><label>Figure 6</label><caption><p id="d2e1179">Range Doppler maps of the RFBeam reference sensor, in panel  <bold>(a)</bold> without MTI filter, in<bold>(b)</bold> with MTI filter. The deer is located at <inline-formula><mml:math id="M62" display="inline"><mml:mn mathvariant="normal">11.5</mml:mn></mml:math></inline-formula> m and is moving at approximately 1 s. The remaining targets in <bold>(b)</bold> are false detections or caused by noise.</p></caption>
          <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f06.png"/>

        </fig>

      <p id="d2e1205">In collaboration with tecVenture, we developed a custom <inline-formula><mml:math id="M63" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensor using the BGT60UTR11AIP chip <xref ref-type="bibr" rid="bib1.bibx11" id="paren.18"/>. The system on chip (SoC) enables a simple system design. An STM32 serves as the micro-controller unit and configures the radar SoC. It also performs the initial FFT of the signal processing chain. The principle of a lens outlined from the IR arrays can be adapted here, and it has been implemented because the used chip has an FoV of <inline-formula><mml:math id="M64" display="inline"><mml:mn mathvariant="normal">180</mml:mn></mml:math></inline-formula>° in azimuth and elevation. With the lens, the FoV can be limited to <inline-formula><mml:math id="M65" display="inline"><mml:mn mathvariant="normal">70</mml:mn></mml:math></inline-formula>°. This results in a wider range of up to <inline-formula><mml:math id="M66" display="inline"><mml:mrow><mml:mi>r</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">30</mml:mn></mml:mrow></mml:math></inline-formula> m. Due to the parameters used, a higher <inline-formula><mml:math id="M67" display="inline"><mml:mrow><mml:mi mathvariant="normal">Δ</mml:mi><mml:mi>R</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">0.075</mml:mn></mml:mrow></mml:math></inline-formula> m than with the RFBeam reference sensor can be achieved.  In Fig. <xref ref-type="fig" rid="F7"/>, three stationary deer were captured with the tecVenture sensor in a wildlife park. For this measurement, the MTI filter is deactivated, as the deer are located at static distances at feeding places.</p>

      <fig id="F7"><label>Figure 7</label><caption><p id="d2e1263">Radar measurement of the tecVenture <inline-formula><mml:math id="M68" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz sensor. There are three upright deer at distances of <inline-formula><mml:math id="M69" display="inline"><mml:mn mathvariant="normal">18</mml:mn></mml:math></inline-formula>, <inline-formula><mml:math id="M70" display="inline"><mml:mn mathvariant="normal">25</mml:mn></mml:math></inline-formula> and <inline-formula><mml:math id="M71" display="inline"><mml:mn mathvariant="normal">31</mml:mn></mml:math></inline-formula> m from the sensor system.</p></caption>
          <graphic xlink:href="https://jsss.copernicus.org/articles/15/27/2026/jsss-15-27-2026-f07.png"/>

        </fig>


</sec>
</sec>
<sec id="Ch1.S3" sec-type="conclusions">
  <label>3</label><title>Conclusions</title>
      <p id="d2e1311">The presented multisensor system successfully combines IR arrays and <inline-formula><mml:math id="M72" display="inline"><mml:mn mathvariant="normal">60</mml:mn></mml:math></inline-formula> GHz radar sensors to detect wild animals near roads. Tests demonstrated the reliable recognition of deer at distances of up to <inline-formula><mml:math id="M73" display="inline"><mml:mn mathvariant="normal">30</mml:mn></mml:math></inline-formula> m, although performance is influenced by ambient temperature, animal orientation and pixel resolution. Lenses can extend detection range but at the cost of narrower FoV.</p>
      <p id="d2e1328">Radar sensors complement thermal imaging by providing robust motion detection and higher resolution for distinguishing animals. Although adult deer were reliably detected, further research is required for smaller animals. The detection of deer has already been tested in high-resolution thermal camera images, but adaptation to IR arrays is still pending. Detection using YOLO works well in both greyscale images and false-colour representations, and can be used for feature extraction. This is important for automatic labelling and verification of AI-based detection in radar images.</p>
      <p id="d2e1331">Ongoing work explores the AI-based classification of thermal images of the IR arrays and motion patterns in range Doppler maps, also using lightweight detection networks such as YOLO. This integrated approach shows strong potential for large-scale deployment, leveraging existing delineator infrastructure to enhance traffic safety and reduce wildlife-related accidents.</p>
</sec>

      
      </body>
    <back><notes notes-type="codedataavailability"><title>Code and data availability</title>

      <p id="d2e1339">Code and data are available upon request.</p>
  </notes><notes notes-type="authorcontribution"><title>Author contributions</title>

      <p id="d2e1345">The campaign was meticulously planned by HM, TW and MMC. TW and HM acquired the funding for the project and did the administration.  The development of the radar systems was undertaken by JB, AR and NH. MS set up the measuring system and wrote the paper.</p>
  </notes><notes notes-type="competinginterests"><title>Competing interests</title>

      <p id="d2e1352">The authors Mike Montoya-Capote, Jonas Berger, Andreas Reichel and Nils Hollmach are employed by the company tecVenture. The peer-review process was guided by an independent editor, and the authors have no other competing interests to declare.</p>
  </notes><notes notes-type="disclaimer"><title>Disclaimer</title>

      <p id="d2e1358">Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. The authors bear the ultimate responsibility for providing appropriate place names. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.</p>
  </notes><notes notes-type="sistatement"><title>Special issue statement</title>

      <p id="d2e1364">This article is part of the special issue “Sensors and Measurement Science International SMSI 2025”. It is a result of the 2025 Sensor and Measurement Science International (SMSI) Conference, Nuremberg, Germany, 6–8 May 2025.</p>
  </notes><ack><title>Acknowledgements</title><p id="d2e1370">This work was supported by the German Federal Ministry of Transport (<uri>https://www.bmv.de/EN/Home/home.html</uri>, last access: 28 January 2026) as part of the mFUND project “OhDeer”.</p></ack><notes notes-type="financialsupport"><title>Financial support</title>

      <p id="d2e1378">This research has been supported by the Bundesministeriums für Verkehr, Bau und Stadtentwicklung (mFUND).</p>
  </notes><notes notes-type="reviewstatement"><title>Review statement</title>

      <p id="d2e1384">This paper was edited by Marco Jose da Silva and reviewed by two anonymous referees.</p>
  </notes><ref-list>
    <title>References</title>

      <ref id="bib1.bibx1"><label>Animoth(2025)</label><mixed-citation>Animoth: <uri>https://animot.eu/</uri> (last access: 8 December 2025), 2025.</mixed-citation></ref>
      <ref id="bib1.bibx2"><label>Car(2011)</label><mixed-citation>Car, V.: Volvo Car Corporation develops technology to avoid collisions with wild animals, Volvocars, <uri>https://www.volvocars.com/us/media/press-releases/57BD69B42F99B2A3/</uri> (last access: 27 January 2026), 2011.</mixed-citation></ref>
      <ref id="bib1.bibx3"><label>Flir(2024)</label><mixed-citation>Flir: Boson640, <uri>https://www.flir.de/products/boson/</uri> (last access: 18 October 2024), 2024.</mixed-citation></ref>
      <ref id="bib1.bibx4"><label>FVA(2008)</label><mixed-citation>FVA: Pilotprojekt Elektronische WildwarnanlageB292 bei Aglasterhausen, <uri>https://www.fva-bw.de/fileadmin/scripts/forschung/wg/081014wildwarn_ber.pdf</uri> (last access: 17 October 2024), 2008.</mixed-citation></ref>
      <ref id="bib1.bibx5"><label>Gil et al.(2021)Gil, Lee, and Cho</label><mixed-citation>Gil, G.-T., Lee, J. Y., and Cho, D.-H.: Estimation of Path Loss Parameters of a Sub-Terahertz Wireless Channel Using Monostatic Radar, IEEE Access, 9, 52654–52663, <ext-link xlink:href="https://doi.org/10.1109/ACCESS.2021.3070378" ext-link-type="DOI">10.1109/ACCESS.2021.3070378</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx6"><label>Heimann(2024)</label><mixed-citation>Heimann: Heimann Sensor, <uri>https://www.heimannsensor.com/</uri> (last access: 20 October 2024), 2024.</mixed-citation></ref>
      <ref id="bib1.bibx7"><label>Kauer/DJV(2023)</label><mixed-citation>Kauer/DJV: Wildunfallstatistik, <uri>https://www.jagdverband.de/zahlen-fakten/jagd-und-wildunfallstatistik/wildunfallstatistik</uri> (last access: 18 Ocotber 2024), 2023.</mixed-citation></ref>
      <ref id="bib1.bibx8"><label>Lavrenko et al.(2021)Lavrenko, Gessler, Walter, Mantz, and Schlick</label><mixed-citation>Lavrenko, T., Gessler, T., Walter, T., Mantz, H., and Schlick, M.: Radar Based Detection and Classification of Vulnerable Road Users, in: The 8th International Symposium on Sensor Science, I3S 2021, MDPI, p. 67, <ext-link xlink:href="https://doi.org/10.3390/i3s2021dresden-10098" ext-link-type="DOI">10.3390/i3s2021dresden-10098</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx9"><label>Lee et al.(2020)Lee, Dinc, and Valdes-Garcia</label><mixed-citation>Lee, W., Dinc, T., and Valdes-Garcia, A.: Multi-Mode 60-GHz Radar Transmitter SoC in 45-nm SOI CMOS, IEEE Journal of Solid-State Circuits, 55, 1187–1198, <ext-link xlink:href="https://doi.org/10.1109/JSSC.2020.2964150" ext-link-type="DOI">10.1109/JSSC.2020.2964150</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx10"><label>Playà‐Montmany and Tattersall(2021)</label><mixed-citation>Playà‐Montmany, N. and Tattersall, G. J.: Spot size, distance and emissivity errors in field applications of infrared thermography, Methods in Ecology and Evolution, 12, 828–840, <ext-link xlink:href="https://doi.org/10.1111/2041-210x.13563" ext-link-type="DOI">10.1111/2041-210x.13563</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx11"><label>rfbeam(2024)</label><mixed-citation>rfbeam: Md3-radar-transceiver, <uri>https://rfbeam.ch/product/v-md3-radar-transceiver/</uri> (last access: 17 October 2024),  2024.</mixed-citation></ref>
      <ref id="bib1.bibx12"><label>Rigling et al.(2022)</label><mixed-citation>Rigling, A., Sandner, V., and Kolke, R.: ADAC testet Assistenzsysteme auf Wildtier-Erkennung, Das Fachmagazin für Verkehrsunfall und Fahrzeugtechnik, 432–437,  <uri>https://www.vkuonline.de/adac-testet-assistenzsysteme-auf-wildtier-erkennung-3289235.html</uri> (last access: 11 February 2026), 2022.</mixed-citation></ref>
      <ref id="bib1.bibx13"><label>Rippl et al.(2020)Rippl, Iberle, Mutschler, Scharf, Mantz, and Walter</label><mixed-citation>Rippl, P., Iberle, J., Mutschler, M. A., Scharf, P. A., Mantz, H., and Walter, T.: Analysis of pedestrian gait patterns using radar based Micro-Doppler Signatures for the protection of vulnerable road users, in: 2020 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), IEEE, 4 pp., <ext-link xlink:href="https://doi.org/10.1109/icmim48759.2020.9299029" ext-link-type="DOI">10.1109/icmim48759.2020.9299029</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx14"><label>Schneider et al.(2025)Schneider, Mantz, Walter, Montoya-Capote, Berger, Reichel, and Hollmach</label><mixed-citation>Schneider, M., Mantz, H., Walter, T., Montoya-Capote, M., Berger, J., Reichel, A., and Hollmach, N.: D7.1 – Recognising Wild Animals on Roads: Radar-based Sensor Systems for Accident Avoidance, in: Lectures, AMA Service GmbH,  228–229, <ext-link xlink:href="https://doi.org/10.5162/smsi2025/d7.1" ext-link-type="DOI">10.5162/smsi2025/d7.1</ext-link>, 2025.</mixed-citation></ref>
      <ref id="bib1.bibx15"><label>Tohidi et al.(2015)Tohidi, Radmard, Karbasi, Behroozi, and Nayebi</label><mixed-citation>Tohidi, E., Radmard, M., Karbasi, S. M., Behroozi, H., and Nayebi, M. M.: Compressive sensing in MTI processing, in: 2015 3rd International Workshop on Compressed Sensing Theory and its Applications to Radar, Sonar and Remote Sensing (CoSeRa), IEEE, 1, 189–193, <ext-link xlink:href="https://doi.org/10.1109/cosera.2015.7330290" ext-link-type="DOI">10.1109/cosera.2015.7330290</ext-link>, 2015.</mixed-citation></ref>
      <ref id="bib1.bibx16"><label>Turmaganbet et al.(2025)Turmaganbet, Zhexebay, Turlykozhayeva, Skabylov, Akhtanov, Temesheva, Tao, and Masalim</label><mixed-citation>Turmaganbet, U., Zhexebay, D., Turlykozhayeva, D., Skabylov, A., Akhtanov, S., Temesheva, S., Tao, M., and Masalim, P.: Thermal infrared object detection with YOLO models, Eurasian Physical Technical Journal, 22, 121–132, <ext-link xlink:href="https://doi.org/10.31489/2025n2/121-132" ext-link-type="DOI">10.31489/2025n2/121-132</ext-link>, 2025.</mixed-citation></ref>
      <ref id="bib1.bibx17"><label>Ultralytics(2023)</label><mixed-citation>Ultralytics: YOLOv8 Architecture, Ultralytics Docs, <uri>https://yolov8.org/yolov8-architecture/</uri> (last access: 12 September 2025), 2023.</mixed-citation></ref>
      <ref id="bib1.bibx18"><label>Wang et al.(2024)Wang, Ren, and Zhang</label><mixed-citation>Wang, T., Ren, S., and Zhang, H.: Nighttime wildlife object detection based on YOLOv8‐night, Electronics Letters, 60, <ext-link xlink:href="https://doi.org/10.1049/ell2.13305" ext-link-type="DOI">10.1049/ell2.13305</ext-link>, 2024.</mixed-citation></ref>
      <ref id="bib1.bibx19"><label>Wang et al.(2019)Wang, Yin, Zhang, and Yan</label><mixed-citation>Wang, X., Yin, J., Zhang, K., and Yan, J.: Research on dispersion phenomenon of infrared imaging system based on black body, Optik, 185, 405–413, <ext-link xlink:href="https://doi.org/10.1016/j.ijleo.2019.03.115" ext-link-type="DOI">10.1016/j.ijleo.2019.03.115</ext-link>, 2019.</mixed-citation></ref>

  </ref-list></back>
</article>
