Trade-off between deep learning for species identification and inference about predator-prey co-occurrence

Reproducible R workflow integrating models in computer vision and ecological statistics

Creative Commons BY License ISSN 2824-7795

Authors
Maëlis Kervellec
Jean-Baptiste Fanjul
Anna Chaine
Lucile Marescot
Yoann Bollet
Christophe Duchamp
Published

April 22, 2022

Modified

July 1, 2025

Keywords

computer-vision, deep-learning, species-distribution-modeling, ecological-statistics

Abstract

Deep learning is used in computer vision problems with important applications in several scientific fields. In ecology for example, there is a growing interest in deep learning for automating repetitive analyses on large amounts of images, such as animal species identification. However, there are challenging issues toward the wide adoption of deep learning by the community of ecologists. First, there is a programming barrier as most algorithms are written in Python while most ecologists are versed in R. Second, recent applications of deep learning in ecology have focused on computational aspects and simple tasks without addressing the underlying ecological questions or carrying out the statistical data analysis to answer these questions. Here, we showcase a reproducible R workflow integrating both deep learning and statistical models using predator-prey relationships as a case study. We illustrate deep learning for the identification of animal species on images collected with camera traps, and quantify spatial co-occurrence using multispecies occupancy models. Despite average model classification performance, ecological inference was similar whether we analysed the ground truth dataset or the classified dataset. This result calls for further work on the trade-offs between time and resources allocated to train models with deep learning and our ability to properly address key ecological questions with biodiversity monitoring. We hope that our reproducible workflow will be useful to ecologists and applied statisticians.

1 Introduction

Computer vision is a field of artificial intelligence in which a machine is taught how to extract and interpret the content of an image (Krizhevsky, Sutskever, and Hinton 2012). Computer vision relies on deep learning that allows computational models to learn from training data – a set of manually labelled images – and make predictions on new data – a set of unlabelled images (Baraniuk, Donoho, and Gavish 2020; LeCun, Bengio, and Hinton 2015). With the growing availability of massive data, computer vision with deep learning is being increasingly used to perform tasks such as object detection, face recognition, action and activity recognition or human pose estimation in fields as diverse as medicine, robotics, transportation, genomics, sports and agriculture (Voulodimos et al. 2018).

In ecology in particular, there is a growing interest in deep learning for automating repetitive analyses on large amounts of images, such as identifying plant and animal species, distinguishing individuals of the same or different species, counting individuals or detecting relevant features (Christin, Hervet, and Lecomte 2019; Lamba et al. 2019; Weinstein 2018). By saving hours of manual data analyses and tapping into massive amounts of data that keep accumulating with technological advances, deep learning has the potential to become an essential tool for ecologists and applied statisticians.

Despite the promising future of computer vision and deep learning, there are challenging issues toward their wide adoption by the community of ecologists (e.g. Wearn, Freeman, and Jacoby 2019). First, there is a programming barrier as most algorithms are written in the Python language (but see MXNet in R and the R interface to Keras) while most ecologists are versed in R (Lai et al. 2019). If ecologists are to use computer vision routinely, there is a need for bridges between these two languages (through, e.g., the reticulate package Allaire et al. (2017) or the shiny package Tabak et al. (2020)). Second, ecologists may be reluctant to develop deep learning algorithms that require large amounts of computation time and consequently come with an environmental cost due to carbon emissions (Strubell, Ganesh, and McCallum 2019). Third, recent applications of computer vision via deep learning in ecology have focused on computational aspects and simple tasks without addressing the underlying ecological questions (Sutherland et al. 2013), or carrying out statistical data analysis to answer these questions (Gimenez et al. 2014). Although perfectly understandable given the challenges at hand, we argue that a better integration of the why (ecological questions), the what (automatically labelled images) and the how (statistics) would be beneficial to computer vision for ecology (see also Weinstein 2018).

Here, we showcase a full why-what-how workflow in R using a case study on the structure of an ecological community (a set of co-occurring species) composed of the Eurasian lynx (Lynx lynx) and its two main prey species. First, we introduce the case study and motivate the need for deep learning. Second, we illustrate deep learning for the identification of animal species in large amounts of images, including model training and validation with a dataset of labelled images, and prediction with a new dataset of unlabelled images. Last, we proceed with the quantification of spatial co-occurrence using statistical models.

2 Collecting images with camera traps

Lynx (Lynx lynx) went extinct in France at the end of the 19th century due to habitat degradation, human persecution and decrease in prey availability (Vandel and Stahl 2005). The species was reintroduced in Switzerland in the 1970s (Breitenmoser 1998), then re-colonised France through the Jura mountains in the 1980s (Vandel and Stahl 2005). The species is listed as endangered under the 2017 IUCN Red list and is of conservation concern in France due to habitat fragmentation, poaching and collisions with vehicles. The Jura holds the bulk of the French lynx population.

To better understand its distribution, we need to quantify its interactions with its main prey, roe deer (Capreolus capreolus) and chamois (Rupicapra rupicapra) (Molinari-Jobin et al. 2007), two ungulate species that are also hunted. To assess the relative contribution of predation and hunting to the community structure and dynamics, a predator-prey program was set up jointly by the French Office for Biodiversity, the Federations of Hunters from the Jura, Ain and Haute-Savoie counties and the French National Centre for Scientific Research. Animal detections were made using a set of camera traps in the Jura mountains that were deployed in the Jura and Ain counties (see Figure 1). Altitude in the Jura site ranges from 520m to 1150m, and from 400m to 950m for the Ain site. Woodland areas cover 69% of the Ain site, with deciduous forests (63%) followed by coniferous (19.5%) and mixed forest (12.5%). In the Jura site, woodland areas cover 62% of the area, with mixed forests (46.6%), deciduous forests (37.3%) and coniferous (14%). In both sites, the remaining habitat is meadows used by cattle.

We divided the two study areas into grids of 2.7 \times 2.7 km cells (sites hereafter) (Zimmermann et al. 2013) in which we set two camera traps per site (Xenon white flash with passive infrared trigger mechanisms, model Capture, Ambush and Attack; Cuddeback), with 18 sites in the Jura study area, and 11 in the Ain study area that were active over the study period (from February 2016 to October 2017 for the Jura county, and from February 2017 to May 2019 for the Ain county). The location of camera traps was chosen to maximise lynx detection. More precisely, camera traps were set up along large paths in the forest, on each side of the path, 50 cm above the ground. Camera traps were checked weekly to change memory cards, batteries and to remove fresh snow after heavy snowfall.

Figure 1: Study area, grid and camera trap locations.

In total, 45563 and 18044 pictures were considered in the Jura and Ain sites respectively, after manually dropping empty pictures and pictures with unidentified species. Note that classifying empty images could be automated with deep learning (Norouzzadeh et al. 2021; Tabak et al. 2020). We identified the species present in all images by hand (see Table 1) using digiKam, a free open-source digital photo management application (https://www.digikam.org/). This operation took several weeks of full-time labor, which is often identified as a limitation of camera trap studies. To expedite this tedious task, computer vision with deep learning has been identified as a promising approach (Norouzzadeh et al. 2021; Tabak et al. 2019; Willi et al. 2019).

Table 1: Species identified in the Jura and Ain study sites with sample sizes (n). Only the 10 species with the most images are shown.
Species in Jura study site n Species in Ain study site n
human 31644 human 4946
vehicule 5637 vehicule 4454
dog 2779 dog 2310
fox 2088 fox 1587
chamois 919 rider 1025
wild boar 522 roe deer 860
badger 401 chamois 780
roe deer 368 hunter 593
cat 343 wild boar 514
lynx 302 badger 461

3 Deep learning for species identification

Using the images we obtained with camera traps (Table 1), we trained a model for identifying species using the Jura study site as a calibration dataset. We then assessed this model’s ability to automatically identify species on a new dataset, also known as transferability, using the Ain study site as an evaluation dataset. Even though the present work quantifies co-occurrence between lynx and its prey, we included other species in the training so that the structure and dynamics of the entire community can be investigated in future work. Also, using specific species categories, instead of a single “other” category besides the focal species, should help the algorithm rule out a focal species with better confidence, either when the picture unambiguously shows something else (a vehicle, for example), or when it shows a species with which a focal species can be confused, e.g. lynx with fox.

3.1 Training - Jura study site

We selected at random 80% of the annotated images for each species in the Jura study site for training, and 20% for testing. We applied various transformations (flipping, brightness and contrast modifications following Shorten and Khoshgoftaar (2019)) to improve training (see Appendix). To reduce model training time and overcome the small number of images, we used transfer learning (Yosinski et al. 2014; Shao, Zhu, and Li 2015) and considered a pre-trained model as a starting point. Specifically, we trained a deep convolutional neural network (ResNet-50) architecture (He et al. 2016) using the fastai library (https://docs.fast.ai/) that builds on the PyTorch library (Paszke et al. 2019). Interestingly, the fastai library comes with an R interface (https://eagerai.github.io/fastai/) that uses the reticulate package to communicate with Python, therefore allowing R users to access up-to-date deep learning tools. We trained models on the Montpellier Bioinformatics Biodiversity platform using a GPU machine (Nvidia Titan Xp) with 16 GB of RAM. We used 20 epochs, which took approximately 10 hours. The computational burden prevented us from providing a full reproducible analysis, but we do so with a subsample of the dataset in the Appendix. All trained models are available from https://doi.org/10.5281/zenodo.5164796.

Using the testing dataset, we calculated three metrics to evaluate our model's performance at correctly identifying species (e.g. Duggan et al. 2021). Specifically, we relied on accuracy, the ratio of correct predictions to the total number of predictions; recall, a measure of false negatives (FN; e.g. an image with a lynx for which our model predicts another species), with recall = TP / (TP + FN) where TP stands for true positives; and precision, a measure of false positives (FP; e.g. an image with any species but a lynx for which our model predicts a lynx), with precision = TP / (TP + FP). In camera trap studies, a common strategy (Duggan et al. 2021) consists in optimizing precision when the focus is on rare species (lynx), and optimizing recall when the focus is on common species (chamois and roe deer).
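These three metrics can be computed directly from a confusion matrix. Below is a minimal sketch in base R for a two-class lynx vs. other problem; the counts are made up for illustration, not those of our study.

```r
# Toy confusion matrix: rows = truth, columns = prediction (made-up counts)
conf <- matrix(c(90,  10,   # truly lynx: 90 predicted lynx (TP), 10 predicted other (FN)
                  5, 895),  # truly other: 5 predicted lynx (FP), 895 predicted other (TN)
               nrow = 2, byrow = TRUE,
               dimnames = list(truth = c("lynx", "other"),
                               prediction = c("lynx", "other")))
accuracy  <- sum(diag(conf)) / sum(conf)                  # (TP + TN) / total
recall    <- conf["lynx", "lynx"] / sum(conf["lynx", ])   # TP / (TP + FN)
precision <- conf["lynx", "lynx"] / sum(conf[, "lynx"])   # TP / (TP + FP)
round(c(accuracy = accuracy, recall = recall, precision = precision), 3)
#  accuracy    recall precision
#     0.985     0.900     0.947
```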

We achieved 85% accuracy during training. Our model had good performances for the three classes we were interested in, with 87% precision for lynx and 81% recall for both roe deer and chamois (Table 2).

Table 2: Model performance metrics. Images from the Jura study site were used for training.
species precision recall
badger 0.78 0.88
red deer 0.67 0.21
chamois 0.86 0.81
cat 0.89 0.78
roe deer 0.67 0.81
dog 0.78 0.84
human 0.99 0.79
hare 0.32 0.52
lynx 0.87 0.95
fox 0.85 0.90
wild boar 0.93 0.88
vehicule 0.95 0.98

3.2 Transferability - Ain study site

We evaluated transferability for our trained model by predicting species on images from the Ain study site which were not used for training. Precision was 77% for lynx, and while we achieved 86% recall for roe deer, our model performed poorly for chamois with 8% recall (Table 3).

Table 3: Model transferability performance. Images from the Ain study site were used for assessing transferability.
species precision recall
badger 0.71 0.89
rider 0.79 0.92
red deer 0.00 0.00
chamois 0.82 0.08
hunter 0.17 0.11
cat 0.46 0.59
roe deer 0.67 0.86
dog 0.77 0.35
human 0.51 0.93
hare 0.37 0.35
lynx 0.77 0.89
marten 0.05 0.04
fox 0.90 0.53
wild boar 0.75 0.94
cow 0.01 0.25
vehicule 0.94 0.51

To better understand this pattern, we display the results under the form of a confusion matrix that compares model classifications to manual classifications (Figure 2). There were a lot of false negatives for chamois, meaning that when a chamois was present in an image, it was often classified as another species by our model.

Figure 2: Confusion matrix comparing automatic to manual species classifications. Species that were predicted by our model are in columns, and species that are actually in the images are in rows. Column and row percentages are also provided at the bottom and right side of each cell respectively. An example of column percentage is as follows: of all pictures for which we predict a wild boar, 75.1% actually contained a wild boar. An example of row percentage is as follows: of all pictures in which we have a wild boar, we predict 94% of them to be wild boars.

Overall, our model trained on images from the Jura study site did poorly at correctly predicting species on images from the Ain study site. This result does not come as a surprise, as generalizing classification algorithms to new environments is known to be difficult (Beery, Horn, and Perona 2018). While a computer scientist might be disappointed in these results, an ecologist would probably wonder whether ecological inference about the co-occurrence between lynx and its prey is biased by these average performances, a question we address in the next section.

4 Spatial co-occurrence

Here, we analysed the data we acquired from the previous section. For the sake of comparison, we considered two datasets, one made of the images manually labelled for both the Jura and Ain study sites pooled together (ground truth dataset), and the other in which we pooled the images that were manually labelled for the Jura study site and the images that were automatically labelled for the Ain study site using our trained model (classified dataset).

We formatted the data by generating monthly detection histories, that is a sequence of detections (Y_{sit} = 1) and non-detections (Y_{sit} = 0), for species s at site i and sampling occasion t (see Figure 3).
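As a concrete illustration, a monthly detection history can be built from a table of classified images in a few lines of base R. The records, site names and number of occasions below are hypothetical, chosen only to show the mechanics.

```r
# Hypothetical classified records: one row per image with a species detection
records <- data.frame(
  species  = c("lynx", "lynx", "roe deer", "chamois", "lynx"),
  site     = c("A", "A", "B", "A", "B"),
  occasion = c(1, 3, 2, 1, 2))  # monthly sampling occasion index
sites <- c("A", "B")
n_occasions <- 3
# Y[i, t] = 1 if the species was photographed at least once at site i
# during sampling occasion t, 0 otherwise
detection_history <- function(sp) {
  y <- matrix(0, nrow = length(sites), ncol = n_occasions,
              dimnames = list(sites, paste0("occ", 1:n_occasions)))
  det <- subset(records, species == sp)
  y[cbind(match(det$site, sites), det$occasion)] <- 1
  y
}
detection_history("lynx")
#   occ1 occ2 occ3
# A    1    0    1
# B    0    1    0
```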

Figure 3: Detections (black) and non-detections (light grey) for each of the 3 species lynx, chamois and roe deer between March and November for all years pooled together. Sites are on the Y axis, while sampling occasions are on the X axis. Only data from the ground truth dataset are displayed.

To quantify spatial co-occurrence between lynx and its prey, we used a multispecies occupancy modeling approach (Rota et al. 2016) implemented in the R package unmarked (Fiske and Chandler 2011) within the maximum likelihood framework. The multispecies occupancy model assumes that observations y_{sit}, conditional on Z_{si}, the latent occupancy state of species s at site i, are drawn from Bernoulli random variables Y_{sit} | Z_{si} \sim \text{Bernoulli}(Z_{si}p_{sit}) where p_{sit} is the detection probability of species s at site i and sampling occasion t. Detection probabilities can be modeled as functions of site and/or sampling covariates, or of the presence/absence of other species, but for the sake of illustration, we make them species-specific only here.

The latent occupancy states are assumed to be distributed as multivariate Bernoulli random variables (Dai, Ding, and Wahba 2013). Let us consider 2 species, species 1 and 2, then Z_i = (Z_{i1}, Z_{i2}) \sim \text{multivariate Bernoulli}(\psi_{11}, \psi_{10}, \psi_{01}, \psi_{00}) where \psi_{11} is the probability that a site is occupied by both species 1 and 2, \psi_{10} the probability that a site is occupied by species 1 but not 2, \psi_{01} the probability that a site is occupied by species 2 but not 1, and \psi_{00} the probability that a site is occupied by neither. Note that we considered species-specific occupancy probabilities only, but these could be modeled as functions of site-specific covariates. Marginal occupancy probabilities are obtained as \Pr(Z_{i1}=1) = \psi_{11} + \psi_{10} and \Pr(Z_{i2}=1) = \psi_{11} + \psi_{01}. With this model, we may also infer co-occurrence by calculating conditional probabilities, such as the probability of a site being occupied by species 2 conditional on species 1 being present, \Pr(Z_{i2} = 1| Z_{i1} = 1) = \displaystyle{\frac{\psi_{11}}{\psi_{11}+\psi_{10}}}.
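These quantities are simple functions of the \psi parameters. A short numerical sketch with illustrative values (not estimates from our study):

```r
# Illustrative multivariate Bernoulli parameters (made up; must sum to one)
psi11 <- 0.5; psi10 <- 0.2; psi01 <- 0.2; psi00 <- 0.1
stopifnot(abs(psi11 + psi10 + psi01 + psi00 - 1) < 1e-9)
marginal1 <- psi11 + psi10  # Pr(species 1 present) = 0.7
marginal2 <- psi11 + psi01  # Pr(species 2 present) = 0.7
# Pr(species 2 present | species 1 present)
cond21 <- psi11 / (psi11 + psi10)
round(cond21, 3)  # 0.714
```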

Despite their appeal and increasing use in ecology, multispecies occupancy models can be difficult to fit to real-world data in practice. First, these models are data-hungry, and regularization methods (Clipp et al. 2021) are needed to prevent occupancy probabilities from being estimated at the boundary of the parameter space or with large uncertainty. Second, and this is true for any joint species distribution model, these models quickly become very complex, with many parameters to estimate, when the number of species increases and co-occurrence is allowed between all species. Here, ecological expertise should be used to consider only meaningful species interactions and to apply parsimony when parameterizing models.

We now turn to the results obtained from a model with five species, namely lynx, chamois, roe deer, fox and cat, in which co-occurrence was allowed only between lynx and each of its two prey species (chamois and roe deer).
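In unmarked, such a model can be specified with occuMulti by fixing to zero the natural parameters of the pairwise interactions we do not wish to estimate. The sketch below is not the exact code we ran for the five-species model: it uses three species and simulated detection matrices to stay self-contained, and the dimensions and formulas are illustrative.

```r
library(unmarked)
set.seed(123)
# Simulated detection/non-detection matrices (29 sites x 9 monthly occasions);
# in practice these come from the detection histories described above
y_lynx    <- matrix(rbinom(29 * 9, 1, 0.3), nrow = 29)
y_roedeer <- matrix(rbinom(29 * 9, 1, 0.5), nrow = 29)
y_chamois <- matrix(rbinom(29 * 9, 1, 0.4), nrow = 29)
umf <- unmarkedFrameOccuMulti(y = list(lynx = y_lynx,
                                       roedeer = y_roedeer,
                                       chamois = y_chamois))
fit <- occuMulti(
  detformulas   = c("~1", "~1", "~1"),  # species-specific detection only
  stateformulas = c("~1", "~1", "~1",   # single-species natural parameters
                    "~1",               # lynx x roe deer co-occurrence
                    "~1",               # lynx x chamois co-occurrence
                    "0"),               # roe deer x chamois fixed to zero
  data = umf, maxOrder = 2)             # no three-way interaction
# Marginal lynx occupancy, and lynx occupancy conditional on roe deer presence
head(predict(fit, type = "state", species = "lynx"))
head(predict(fit, type = "state", species = "lynx", cond = "roedeer"))
```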

Detection probabilities were indistinguishable (at the third decimal) whether we used the ground truth or the classified dataset, with p_{\text{lynx}} = 0.51 (0.45, 0.58), p_{\text{roe deer}} = 0.63 (0.57, 0.68) and p_{\text{chamois}} = 0.61 (0.55, 0.67).

We also found that occupancy probability estimates were similar whether we used the ground truth or the classified dataset (Figure 4). Roe deer was the most prevalent species, but lynx and chamois also occurred with high probability (Figure 4). Note that, despite chamois being often misclassified (Figure 2), its marginal occupancy tends to be higher when estimated with the classified dataset. Ecologically speaking, this might well be the case if the correctly classified detections are spread over all camera traps. The difference in marginal occupancy seems, however, non-significant judging by the overlap between the two confidence intervals.

Figure 4: Marginal occupancy probabilities for all three species (lynx, roe deer and chamois). Parameter estimates are from a multispecies occupancy model using either the ground truth dataset (in red) or the classified dataset (in blue-grey). Note that marginal occupancy probabilities are estimated with high precision for roe deer, which explains why the associated confidence intervals do not show.

Because marginal occupancy probabilities were high, probabilities of co-occurrence were also estimated to be high (Figure 5). Our results should be interpreted bearing in mind that co-occurrence is a necessary but not sufficient condition for actual interaction. When both prey species were present, lynx occupancy was higher than when both were absent (Figure 5). Lynx was more sensitive to the presence of roe deer than to that of chamois (Figure 5).

Figure 5: Lynx occupancy probability conditional on the presence or absence of its prey (roe deer and chamois). Parameter estimates are from a multispecies occupancy model using either the ground truth dataset (in red) or the classified dataset (in blue-grey).

Overall, we found similar or higher uncertainty in estimates obtained from the classified dataset (Figure 4 and Figure 5). Sample size being similar for both datasets, we do not have a solid explanation for this pattern.

5 Discussion

In this paper, we aimed at illustrating a reproducible workflow for studying the structure of an animal community and species spatial co-occurrence (why) using images acquired from camera traps and automatically labelled with deep learning (what), which we analysed with statistical occupancy models accounting for imperfect species detection (how). Overall, we found that, even though model transferability could be improved, inference about the co-occurrence of lynx and its prey was similar whether we analysed the ground truth data or the classified data.

This result calls for further work on the trade-offs between time and resources allocated to train models with deep learning and our ability to correctly answer key ecological questions with camera-trap surveys. In other words, while a computer scientist might be keen on spending time training models to achieve top performance, an ecologist would rather rely on a model showing average performance and use this time to proceed with statistical analyses if, of course, errors in computer-annotated images do not make ecological inference flawed. The right balance may be found with collaborative projects in which scientists from artificial intelligence, statistics and ecology agree on a common objective, and identify research questions that can pique the interest of all parties.

Our demonstration remains however empirical, and ecological inference might no longer be robust to misclassification if detections and non-detections were pooled weekly or daily, or if more complex models, e.g. including time-varying detection probabilities and/or habitat-specific occupancy probabilities, were fitted to the data. Therefore, we encourage others to try and replicate our results. In that spirit, we praise previous work on plants which used deep learning to produce occurrence data and tested the sensitivity of species distribution models to image classification errors (Botella et al. 2018). We also see two avenues of research that could benefit the integration of deep learning and ecological statistics. First, a simulation study could be conducted to evaluate bias and precision in ecological parameter estimators with regard to errors in image annotation by computers. The outcome of this exercise could be, for example, guidelines informing on the confidence an investigator may place in ecological inference as a function of the amount of false negatives and false positives. Second, annotation errors could be accommodated directly in statistical models. For example, single-species occupancy models account for false negatives when a species is not detected by the camera at a site where it is present, as well as false positives when a species is detected at a site where it is not present due to species misidentification by the observer (Miller et al. 2011). Pending a careful distinction between ecological vs. computer-generated false negatives and false positives, error rates could be added to multispecies occupancy models (Chambert et al. 2018) and informed by recall and precision metrics obtained during model training (Tabak et al. 2020).
An alternative quick and dirty approach would consist in adopting a Monte Carlo approach by sampling the species detected or non-detected in each picture according to its predicted probability of belonging to a given class, then building the corresponding dataset and fitting occupancy models to it for each sample.
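A sketch of this resampling idea is shown below, with made-up class probabilities for two images; the probabilities, species set and number of replicates are purely illustrative.

```r
# For each image, resample the assigned species from the classifier's
# predicted class probabilities (made up here), once per Monte Carlo replicate
set.seed(42)
probs <- matrix(c(0.80, 0.15, 0.05,   # image 1: Pr(lynx), Pr(roe deer), Pr(chamois)
                  0.10, 0.70, 0.20),  # image 2
                nrow = 2, byrow = TRUE,
                dimnames = list(NULL, c("lynx", "roe deer", "chamois")))
n_mc <- 100  # number of Monte Carlo replicates
labels_mc <- replicate(n_mc,
  apply(probs, 1, function(p) sample(colnames(probs), 1, prob = p)))
# labels_mc is a 2 x n_mc character matrix; each column is one resampled
# dataset from which detection histories can be rebuilt and an occupancy
# model refitted
table(labels_mc[1, ]) / n_mc  # empirical frequencies approach the predicted probs
```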

When it comes to the case study, our results should be discussed with regard to the sensitivity of co-occurrence estimates to errors in automatic species classification. In particular, we expected that confusions between the two prey species might artificially increase the estimated probability of co-occurrence with lynx. This was illustrated by \Pr(\text{lynx present} | \text{roe deer present and chamois absent}) (resp. \Pr(\text{lynx present} | \text{roe deer absent and chamois present})) being estimated higher (resp. lower) with the classified than the ground truth dataset (Figure 5). This pattern could be explained by chamois being often classified as (and confused with) roe deer (Figure 2).

Our results are only preliminary and we see several perspectives to our work. First, we focused our analysis on lynx and its main prey, while other species should be included to get a better understanding of the community structure. For example, both lynx and fox prey on small rodents and birds, and a model including co-occurrence between these two predators showed better support by the data (AIC was 1544 when co-occurrence was included vs. 1557 when it was not). Second, we aim at quantifying the relative contribution of biotic (lynx predation on chamois and roe deer) and abiotic (habitat quality) processes to the composition and dynamics of this ecological community. Third, to benefit future camera trap studies of lynx in the Jura mountains, we plan to train a model again using more manually annotated images from both the Jura and the Ain study sites. These perspectives are the object of ongoing work.

With the rapid advances in technologies for biodiversity monitoring (Lahoz-Monfort and Magrath 2021), the possibility of analysing large amounts of images makes deep learning appealing to ecologists. We hope that our proposal of a reproducible R workflow for deep learning and statistical ecology will encourage further studies in the integration of these disciplines, and contribute to the adoption of computer vision by ecologists.

6 Appendix: Reproducible example of species identification on camera trap images with CPU

In this section, we go through a reproducible example of the entire deep learning workflow, including data preparation, model training, and automatic labeling of new images. We used a subsample of 467 images from the original dataset in the Jura county to allow the training of our model with CPU on a personal computer. We also used 14 images from the original dataset in the Ain county to illustrate prediction.

6.1 Training and validation datasets

We first split the dataset of Jura images into two datasets, one for training and the other for validation. We use the exifr package to extract metadata from the images, get a list of image names, and extract the species labels from these.

Table 4: Species considered, and number of images with these species in them.
library(exifr)
library(tidyverse) # for the pipe and the dplyr/tidyr/forcats verbs used below
pix_folder <- 'pix/pixJura/'
file_list <- list.files(path = pix_folder,
                        recursive = TRUE,
                        pattern = "*.jpg",
                        full.names = TRUE)
labels <-
  read_exif(file_list) %>%
  as_tibble() %>%
  unnest(Keywords, keep_empty = TRUE) %>% # keep_empty = TRUE keeps pix with no labels (empty pix)
  group_by(SourceFile) %>%
  slice_head() %>% # when several labels in a pix, keep first only
  ungroup() %>%
  mutate(Keywords = as_factor(Keywords)) %>%
  mutate(Keywords = fct_explicit_na(Keywords, "wo_tag")) %>% # when pix has no tag
  select(SourceFile, FileName, Keywords) %>%
  mutate(Keywords = fct_recode(Keywords,
                               "chat" = "chat forestier",
                               "lievre" = "lièvre",
                               "vehicule" = "véhicule",
                               "ni" = "Non identifié")) %>%
  filter(!(Keywords %in% c("ni", "wo_tag")))
Keywords n
humain 143
vehicule 135
renard 58
sangliers 33
chasseur 17
chien 14
lynx 13
chevreuil 13
chamois 12
blaireaux 10
chat 8
lievre 4
fouine 1
cavalier 1

Then we pick 80% of the images for training in each category, the rest being used for validation.

# training dataset
pix_train <- labels %>%
  select(SourceFile, FileName, Keywords) %>%
  group_by(Keywords) %>%
  filter(between(row_number(), 1, floor(n()*80/100))) # 80% per category
# validation dataset
pix_valid <- labels %>%
  group_by(Keywords) %>%
  filter(between(row_number(), floor(n()*80/100) + 1, n()))

Finally, we store these images in two distinct directories named train and valid.

# create dir train/ and copy pix there, organised by categories
dir.create('pix/train') # create training directory
for (i in levels(fct_drop(pix_train$Keywords))) dir.create(paste0('pix/train/',i)) # create dir for labels
for (i in 1:nrow(pix_train)){
    file.copy(as.character(pix_train$SourceFile[i]),
              paste0('pix/train/', as.character(pix_train$Keywords[i]))) # copy pix in corresp dir
}
# create dir valid/ and copy pix there, organised by categories.
dir.create('pix/valid') # create validation dir
for (i in levels(fct_drop(pix_train$Keywords))) dir.create(paste0('pix/valid/',i)) # create dir for labels
for (i in 1:nrow(pix_valid)){
    file.copy(as.character(pix_valid$SourceFile[i]),
              paste0('pix/valid/', as.character(pix_valid$Keywords[i]))) # copy pix in corresp dir
}
# delete pictures in valid/ directory for which we did not train the model
to_be_deleted <- setdiff(levels(fct_drop(pix_valid$Keywords)), levels(fct_drop(pix_train$Keywords)))
if (!is_empty(to_be_deleted)) {
  for (i in seq_along(to_be_deleted)){
    unlink(paste0('pix/valid/', to_be_deleted[i]),
           recursive = TRUE) # recursive = TRUE is needed to remove a directory and its content
  }
}

What is the sample size of these two datasets?

library(knitr)      # for kable()
library(kableExtra) # for kable_styling()
bind_rows("training" = pix_train, "validation" = pix_valid, .id = "dataset") %>%
  group_by(dataset) %>%
  count(Keywords) %>%
  rename(category = Keywords) %>%
  kable() %>%
  kable_styling()
Table 5: Sample size (n) for the training and validation datasets.
dataset category n
training humain 114
training vehicule 108
training chamois 9
training blaireaux 8
training sangliers 26
training renard 46
training chasseur 13
training lynx 10
training chien 11
training chat 6
training chevreuil 10
training lievre 3
validation humain 29
validation vehicule 27
validation chamois 3
validation blaireaux 2
validation sangliers 7
validation renard 12
validation chasseur 4
validation lynx 3
validation chien 3
validation fouine 1
validation chat 2
validation chevreuil 3
validation lievre 1
validation cavalier 1

6.2 Transfer learning

We proceed with transfer learning using images from the Jura county (a subsample, more exactly). We first load the images and apply standard transformations to improve training (flip, rotate, zoom, light transforms).

library(reticulate)
#reticulate::use_condaenv("gimenez")
library(fastai)
dls <- ImageDataLoaders_from_folder(
  path = "pix/",
  train = "train",
  valid = "valid",
  item_tfms = Resize(size = 460),
  bs = 10,
  batch_tfms = list(aug_transforms(size = 224,
                                   min_scale = 0.75), # transformation
                    Normalize_from_stats( imagenet_stats() )),
  num_workers = 0,
  ImageFile.LOAD_TRUNCATED_IMAGES = TRUE)

Then we get the model architecture. For the sake of illustration, we use a resnet18 here, but we used a resnet50 to get the full results presented in the main text.
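The learner can be created from the data loaders dls defined above with the fastai R interface. The metrics and number of epochs below are illustrative choices for this small CPU example, not the exact settings used for the full model.

```r
library(fastai)
# Build a learner from the data loaders defined above, starting from
# a ResNet-18 pre-trained on ImageNet
learn <- cnn_learner(dls, resnet18(), metrics = list(accuracy, error_rate))
# Train for a couple of epochs, enough for this small CPU example
learn %>% fit_one_cycle(n_epoch = 2)
```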

--2025-07-01 13:28:49--  https://download.pytorch.org/models/resnet18-f37072fd.pth
Resolving download.pytorch.org (download.pytorch.org)... 18.164.174.18, 18.164.174.78, 18.164.174.99, ...
Connecting to download.pytorch.org (download.pytorch.org)|18.164.174.18|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 46830571 (45M) [application/x-www-form-urlencoded]
Saving to: ‘resnet18-f37072fd.pth’

[download progress output omitted]
 22150K .......... .......... .......... .......... .......... 48%  311M 0s
 22200K .......... .......... .......... .......... .......... 48%  321M 0s
 22250K .......... .......... .......... .......... .......... 48%  454M 0s
 22300K .......... .......... .......... .......... .......... 48%  297M 0s
 22350K .......... .......... .......... .......... .......... 48%  227M 0s
 22400K .......... .......... .......... .......... .......... 49%  458M 0s
 22450K .......... .......... .......... .......... .......... 49%  262M 0s
 22500K .......... .......... .......... .......... .......... 49%  310M 0s
 22550K .......... .......... .......... .......... .......... 49%  468M 0s
 22600K .......... .......... .......... .......... .......... 49%  295M 0s
 22650K .......... .......... .......... .......... .......... 49%  271M 0s
 22700K .......... .......... .......... .......... .......... 49%  443M 0s
 22750K .......... .......... .......... .......... .......... 49%  210M 0s
 22800K .......... .......... .......... .......... .......... 49%  464M 0s
 22850K .......... .......... .......... .......... .......... 50%  250M 0s
 22900K .......... .......... .......... .......... .......... 50%  309M 0s
 22950K .......... .......... .......... .......... .......... 50%  315M 0s
 23000K .......... .......... .......... .......... .......... 50%  286M 0s
 23050K .......... .......... .......... .......... .......... 50%  328M 0s
 23100K .......... .......... .......... .......... .......... 50%  365M 0s
 23150K .......... .......... .......... .......... .......... 50%  260M 0s
 23200K .......... .......... .......... .......... .......... 50%  322M 0s
 23250K .......... .......... .......... .......... .......... 50%  446M 0s
 23300K .......... .......... .......... .......... .......... 51%  278M 0s
 23350K .......... .......... .......... .......... .......... 51%  296M 0s
 23400K .......... .......... .......... .......... .......... 51%  299M 0s
 23450K .......... .......... .......... .......... .......... 51%  469M 0s
 23500K .......... .......... .......... .......... .......... 51%  287M 0s
 23550K .......... .......... .......... .......... .......... 51%  242M 0s
 23600K .......... .......... .......... .......... .......... 51%  261M 0s
 23650K .......... .......... .......... .......... .......... 51%  297M 0s
 23700K .......... .......... .......... .......... .......... 51%  360M 0s
 23750K .......... .......... .......... .......... .......... 52%  318M 0s
 23800K .......... .......... .......... .......... .......... 52%  315M 0s
 23850K .......... .......... .......... .......... .......... 52%  317M 0s
 23900K .......... .......... .......... .......... .......... 52%  435M 0s
 23950K .......... .......... .......... .......... .......... 52%  243M 0s
 24000K .......... .......... .......... .......... .......... 52%  304M 0s
 24050K .......... .......... .......... .......... .......... 52%  296M 0s
 24100K .......... .......... .......... .......... .......... 52%  295M 0s
 24150K .......... .......... .......... .......... .......... 52%  455M 0s
 24200K .......... .......... .......... .......... .......... 53%  224M 0s
 24250K .......... .......... .......... .......... .......... 53%  386M 0s
 24300K .......... .......... .......... .......... .......... 53%  469M 0s
 24350K .......... .......... .......... .......... .......... 53%  214M 0s
 24400K .......... .......... .......... .......... .......... 53%  231M 0s
 24450K .......... .......... .......... .......... .......... 53%  465M 0s
 24500K .......... .......... .......... .......... .......... 53%  453M 0s
 24550K .......... .......... .......... .......... .......... 53%  470M 0s
 24600K .......... .......... .......... .......... .......... 53%  217M 0s
 24650K .......... .......... .......... .......... .......... 54%  440M 0s
 24700K .......... .......... .......... .......... .......... 54%  250M 0s
 24750K .......... .......... .......... .......... .......... 54%  333M 0s
 24800K .......... .......... .......... .......... .......... 54%  259M 0s
 24850K .......... .......... .......... .......... .......... 54%  427M 0s
 24900K .......... .......... .......... .......... .......... 54%  408M 0s
 24950K .......... .......... .......... .......... .......... 54%  459M 0s
 25000K .......... .......... .......... .......... .......... 54%  218M 0s
 25050K .......... .......... .......... .......... .......... 54%  434M 0s
 25100K .......... .......... .......... .......... .......... 54%  472M 0s
 25150K .......... .......... .......... .......... .......... 55%  198M 0s
 25200K .......... .......... .......... .......... .......... 55%  319M 0s
 25250K .......... .......... .......... .......... .......... 55%  402M 0s
 25300K .......... .......... .......... .......... .......... 55%  252M 0s
 25350K .......... .......... .......... .......... .......... 55%  473M 0s
 25400K .......... .......... .......... .......... .......... 55%  277M 0s
 25450K .......... .......... .......... .......... .......... 55%  277M 0s
 25500K .......... .......... .......... .......... .......... 55%  305M 0s
 25550K .......... .......... .......... .......... .......... 55%  240M 0s
 25600K .......... .......... .......... .......... .......... 56%  355M 0s
 25650K .......... .......... .......... .......... .......... 56%  339M 0s
 25700K .......... .......... .......... .......... .......... 56%  459M 0s
 25750K .......... .......... .......... .......... .......... 56%  313M 0s
 25800K .......... .......... .......... .......... .......... 56%  293M 0s
 25850K .......... .......... .......... .......... .......... 56%  319M 0s
 25900K .......... .......... .......... .......... .......... 56%  280M 0s
 25950K .......... .......... .......... .......... .......... 56%  262M 0s
 26000K .......... .......... .......... .......... .......... 56%  309M 0s
 26050K .......... .......... .......... .......... .......... 57%  315M 0s
 26100K .......... .......... .......... .......... .......... 57%  312M 0s
 26150K .......... .......... .......... .......... .......... 57%  448M 0s
 26200K .......... .......... .......... .......... .......... 57%  248M 0s
 26250K .......... .......... .......... .......... .......... 57%  320M 0s
 26300K .......... .......... .......... .......... .......... 57%  282M 0s
 26350K .......... .......... .......... .......... .......... 57%  338M 0s
 26400K .......... .......... .......... .......... .......... 57%  458M 0s
 26450K .......... .......... .......... .......... .......... 57%  457M 0s
 26500K .......... .......... .......... .......... .......... 58%  181M 0s
 26550K .......... .......... .......... .......... .......... 58%  287M 0s
 26600K .......... .......... .......... .......... .......... 58%  386M 0s
 26650K .......... .......... .......... .......... .......... 58%  318M 0s
 26700K .......... .......... .......... .......... .......... 58%  322M 0s
 26750K .......... .......... .......... .......... .......... 58%  258M 0s
 26800K .......... .......... .......... .......... .......... 58%  248M 0s
 26850K .......... .......... .......... .......... .......... 58%  299M 0s
 26900K .......... .......... .......... .......... .......... 58%  299M 0s
 26950K .......... .......... .......... .......... .......... 59%  463M 0s
 27000K .......... .......... .......... .......... .......... 59%  278M 0s
 27050K .......... .......... .......... .......... .......... 59%  323M 0s
 27100K .......... .......... .......... .......... .......... 59%  429M 0s
 27150K .......... .......... .......... .......... .......... 59%  253M 0s
 27200K .......... .......... .......... .......... .......... 59%  338M 0s
 27250K .......... .......... .......... .......... .......... 59%  439M 0s
 27300K .......... .......... .......... .......... .......... 59%  282M 0s
 27350K .......... .......... .......... .......... .......... 59%  322M 0s
 27400K .......... .......... .......... .......... .......... 60%  397M 0s
 27450K .......... .......... .......... .......... .......... 60%  351M 0s
 27500K .......... .......... .......... .......... .......... 60%  396M 0s
 27550K .......... .......... .......... .......... .......... 60%  233M 0s
 27600K .......... .......... .......... .......... .......... 60%  462M 0s
 27650K .......... .......... .......... .......... .......... 60%  250M 0s
 27700K .......... .......... .......... .......... .......... 60%  470M 0s
 27750K .......... .......... .......... .......... .......... 60%  462M 0s
 27800K .......... .......... .......... .......... .......... 60%  232M 0s
 27850K .......... .......... .......... .......... .......... 61%  439M 0s
 27900K .......... .......... .......... .......... .......... 61%  230M 0s
 27950K .......... .......... .......... .......... .......... 61%  242M 0s
 28000K .......... .......... .......... .......... .......... 61%  466M 0s
 28050K .......... .......... .......... .......... .......... 61%  262M 0s
 28100K .......... .......... .......... .......... .......... 61%  278M 0s
 28150K .......... .......... .......... .......... .......... 61%  446M 0s
 28200K .......... .......... .......... .......... .......... 61%  475M 0s
 28250K .......... .......... .......... .......... .......... 61%  247M 0s
 28300K .......... .......... .......... .......... .......... 61%  302M 0s
 28350K .......... .......... .......... .......... .......... 62%  342M 0s
 28400K .......... .......... .......... .......... .......... 62%  246M 0s
 28450K .......... .......... .......... .......... .......... 62%  388M 0s
 28500K .......... .......... .......... .......... .......... 62%  253M 0s
 28550K .......... .......... .......... .......... .......... 62%  298M 0s
 28600K .......... .......... .......... .......... .......... 62%  305M 0s
 28650K .......... .......... .......... .......... .......... 62%  302M 0s
 28700K .......... .......... .......... .......... .......... 62%  456M 0s
 28750K .......... .......... .......... .......... .......... 62%  347M 0s
 28800K .......... .......... .......... .......... .......... 63%  189M 0s
 28850K .......... .......... .......... .......... .......... 63%  314M 0s
 28900K .......... .......... .......... .......... .......... 63%  291M 0s
 28950K .......... .......... .......... .......... .......... 63%  291M 0s
 29000K .......... .......... .......... .......... .......... 63%  457M 0s
 29050K .......... .......... .......... .......... .......... 63%  323M 0s
 29100K .......... .......... .......... .......... .......... 63%  282M 0s
 29150K .......... .......... .......... .......... .......... 63%  247M 0s
 29200K .......... .......... .......... .......... .......... 63%  314M 0s
 29250K .......... .......... .......... .......... .......... 64%  351M 0s
 29300K .......... .......... .......... .......... .......... 64%  301M 0s
 29350K .......... .......... .......... .......... .......... 64%  446M 0s
 29400K .......... .......... .......... .......... .......... 64%  412M 0s
 29450K .......... .......... .......... .......... .......... 64%  232M 0s
 29500K .......... .......... .......... .......... .......... 64%  284M 0s
 29550K .......... .......... .......... .......... .......... 64%  352M 0s
 29600K .......... .......... .......... .......... .......... 64%  470M 0s
 29650K .......... .......... .......... .......... .......... 64%  252M 0s
 29700K .......... .......... .......... .......... .......... 65%  438M 0s
 29750K .......... .......... .......... .......... .......... 65%  407M 0s
 29800K .......... .......... .......... .......... .......... 65%  250M 0s
 29850K .......... .......... .......... .......... .......... 65%  314M 0s
 29900K .......... .......... .......... .......... .......... 65%  296M 0s
 29950K .......... .......... .......... .......... .......... 65%  341M 0s
 30000K .......... .......... .......... .......... .......... 65%  267M 0s
 30050K .......... .......... .......... .......... .......... 65%  292M 0s
 30100K .......... .......... .......... .......... .......... 65%  295M 0s
 30150K .......... .......... .......... .......... .......... 66%  318M 0s
 30200K .......... .......... .......... .......... .......... 66%  441M 0s
 30250K .......... .......... .......... .......... .......... 66%  317M 0s
 30300K .......... .......... .......... .......... .......... 66%  325M 0s
 30350K .......... .......... .......... .......... .......... 66%  232M 0s
 30400K .......... .......... .......... .......... .......... 66%  461M 0s
 30450K .......... .......... .......... .......... .......... 66%  434M 0s
 30500K .......... .......... .......... .......... .......... 66%  478M 0s
 30550K .......... .......... .......... .......... .......... 66%  484M 0s
 30600K .......... .......... .......... .......... .......... 67%  465M 0s
 30650K .......... .......... .......... .......... .......... 67%  378M 0s
 30700K .......... .......... .......... .......... .......... 67%  319M 0s
 30750K .......... .......... .......... .......... .......... 67%  239M 0s
 30800K .......... .......... .......... .......... .......... 67%  456M 0s
 30850K .......... .......... .......... .......... .......... 67%  253M 0s
 30900K .......... .......... .......... .......... .......... 67%  450M 0s
 30950K .......... .......... .......... .......... .......... 67%  466M 0s
 31000K .......... .......... .......... .......... .......... 67%  249M 0s
 31050K .......... .......... .......... .......... .......... 68%  477M 0s
 31100K .......... .......... .......... .......... .......... 68%  449M 0s
 31150K .......... .......... .......... .......... .......... 68%  197M 0s
 31200K .......... .......... .......... .......... .......... 68%  454M 0s
 31250K .......... .......... .......... .......... .......... 68%  240M 0s
 31300K .......... .......... .......... .......... .......... 68%  486M 0s
 31350K .......... .......... .......... .......... .......... 68%  260M 0s
 31400K .......... .......... .......... .......... .......... 68%  408M 0s
 31450K .......... .......... .......... .......... .......... 68%  316M 0s
 31500K .......... .......... .......... .......... .......... 68%  402M 0s
 31550K .......... .......... .......... .......... .......... 69%  252M 0s
 31600K .......... .......... .......... .......... .......... 69%  470M 0s
 31650K .......... .......... .......... .......... .......... 69%  277M 0s
 31700K .......... .......... .......... .......... .......... 69%  440M 0s
 31750K .......... .......... .......... .......... .......... 69%  475M 0s
 31800K .......... .......... .......... .......... .......... 69%  208M 0s
 31850K .......... .......... .......... .......... .......... 69%  428M 0s
 31900K .......... .......... .......... .......... .......... 69%  481M 0s
 31950K .......... .......... .......... .......... .......... 69%  344M 0s
 32000K .......... .......... .......... .......... .......... 70%  216M 0s
 32050K .......... .......... .......... .......... .......... 70%  465M 0s
 32100K .......... .......... .......... .......... .......... 70%  214M 0s
 32150K .......... .......... .......... .......... .......... 70%  415M 0s
 32200K .......... .......... .......... .......... .......... 70%  453M 0s
 32250K .......... .......... .......... .......... .......... 70%  483M 0s
 32300K .......... .......... .......... .......... .......... 70%  230M 0s
 32350K .......... .......... .......... .......... .......... 70%  328M 0s
 32400K .......... .......... .......... .......... .......... 70%  247M 0s
 32450K .......... .......... .......... .......... .......... 71%  360M 0s
 32500K .......... .......... .......... .......... .......... 71%  252M 0s
 32550K .......... .......... .......... .......... .......... 71%  312M 0s
 32600K .......... .......... .......... .......... .......... 71%  408M 0s
 32650K .......... .......... .......... .......... .......... 71%  384M 0s
 32700K .......... .......... .......... .......... .......... 71%  318M 0s
 32750K .......... .......... .......... .......... .......... 71%  228M 0s
 32800K .......... .......... .......... .......... .......... 71%  459M 0s
 32850K .......... .......... .......... .......... .......... 71%  262M 0s
 32900K .......... .......... .......... .......... .......... 72%  469M 0s
 32950K .......... .......... .......... .......... .......... 72%  449M 0s
 33000K .......... .......... .......... .......... .......... 72%  248M 0s
 33050K .......... .......... .......... .......... .......... 72%  459M 0s
 33100K .......... .......... .......... .......... .......... 72%  465M 0s
 33150K .......... .......... .......... .......... .......... 72%  181M 0s
 33200K .......... .......... .......... .......... .......... 72%  441M 0s
 33250K .......... .......... .......... .......... .......... 72%  467M 0s
 33300K .......... .......... .......... .......... .......... 72%  245M 0s
 33350K .......... .......... .......... .......... .......... 73%  479M 0s
 33400K .......... .......... .......... .......... .......... 73%  471M 0s
 33450K .......... .......... .......... .......... .......... 73%  191M 0s
 33500K .......... .......... .......... .......... .......... 73%  406M 0s
 33550K .......... .......... .......... .......... .......... 73%  310M 0s
 33600K .......... .......... .......... .......... .......... 73%  333M 0s
 33650K .......... .......... .......... .......... .......... 73%  294M 0s
 33700K .......... .......... .......... .......... .......... 73%  411M 0s
 33750K .......... .......... .......... .......... .......... 73%  483M 0s
 33800K .......... .......... .......... .......... .......... 74%  234M 0s
 33850K .......... .......... .......... .......... .......... 74%  471M 0s
 33900K .......... .......... .......... .......... .......... 74%  252M 0s
 33950K .......... .......... .......... .......... .......... 74%  348M 0s
 34000K .......... .......... .......... .......... .......... 74%  255M 0s
 34050K .......... .......... .......... .......... .......... 74%  463M 0s
 34100K .......... .......... .......... .......... .......... 74%  238M 0s
 34150K .......... .......... .......... .......... .......... 74%  471M 0s
 34200K .......... .......... .......... .......... .......... 74%  244M 0s
 34250K .......... .......... .......... .......... .......... 75%  482M 0s
 34300K .......... .......... .......... .......... .......... 75%  464M 0s
 34350K .......... .......... .......... .......... .......... 75%  202M 0s
 34400K .......... .......... .......... .......... .......... 75%  461M 0s
 34450K .......... .......... .......... .......... .......... 75%  408M 0s
 34500K .......... .......... .......... .......... .......... 75%  217M 0s
 34550K .......... .......... .......... .......... .......... 75%  204M 0s
 34600K .......... .......... .......... .......... .......... 75%  482M 0s
 34650K .......... .......... .......... .......... .......... 75%  306M 0s
 34700K .......... .......... .......... .......... .......... 75%  303M 0s
 34750K .......... .......... .......... .......... .......... 76%  312M 0s
 34800K .......... .......... .......... .......... .......... 76%  245M 0s
 34850K .......... .......... .......... .......... .......... 76%  368M 0s
 34900K .......... .......... .......... .......... .......... 76%  329M 0s
 34950K .......... .......... .......... .......... .......... 76%  450M 0s
 35000K .......... .......... .......... .......... .......... 76%  470M 0s
 35050K .......... .......... .......... .......... .......... 76%  261M 0s
 35100K .......... .......... .......... .......... .......... 76%  406M 0s
 35150K .......... .......... .......... .......... .......... 76%  209M 0s
 35200K .......... .......... .......... .......... .......... 77%  314M 0s
 35250K .......... .......... .......... .......... .......... 77%  469M 0s
 35300K .......... .......... .......... .......... .......... 77%  251M 0s
 35350K .......... .......... .......... .......... .......... 77%  461M 0s
 35400K .......... .......... .......... .......... .......... 77%  462M 0s
 35450K .......... .......... .......... .......... .......... 77%  219M 0s
 35500K .......... .......... .......... .......... .......... 77%  482M 0s
 35550K .......... .......... .......... .......... .......... 77%  192M 0s
 35600K .......... .......... .......... .......... .......... 77%  431M 0s
 35650K .......... .......... .......... .......... .......... 78%  461M 0s
 35700K .......... .......... .......... .......... .......... 78%  224M 0s
 35750K .......... .......... .......... .......... .......... 78%  437M 0s
 35800K .......... .......... .......... .......... .......... 78%  225M 0s
 35850K .......... .......... .......... .......... .......... 78%  452M 0s
 35900K .......... .......... .......... .......... .......... 78%  450M 0s
 35950K .......... .......... .......... .......... .......... 78%  189M 0s
 36000K .......... .......... .......... .......... .......... 78%  464M 0s
 36050K .......... .......... .......... .......... .......... 78%  402M 0s
 36100K .......... .......... .......... .......... .......... 79%  201M 0s
 36150K .......... .......... .......... .......... .......... 79%  393M 0s
 36200K .......... .......... .......... .......... .......... 79%  360M 0s
 36250K .......... .......... .......... .......... .......... 79%  334M 0s
 36300K .......... .......... .......... .......... .......... 79%  296M 0s
 36350K .......... .......... .......... .......... .......... 79%  352M 0s
 36400K .......... .......... .......... .......... .......... 79%  239M 0s
 36450K .......... .......... .......... .......... .......... 79%  458M 0s
 36500K .......... .......... .......... .......... .......... 79%  260M 0s
 36550K .......... .......... .......... .......... .......... 80%  416M 0s
 36600K .......... .......... .......... .......... .......... 80%  469M 0s
 36650K .......... .......... .......... .......... .......... 80%  199M 0s
 36700K .......... .......... .......... .......... .......... 80%  328M 0s
 36750K .......... .......... .......... .......... .......... 80%  313M 0s
 36800K .......... .......... .......... .......... .......... 80%  321M 0s
 36850K .......... .......... .......... .......... .......... 80%  314M 0s
 36900K .......... .......... .......... .......... .......... 80%  299M 0s
 36950K .......... .......... .......... .......... .......... 80%  312M 0s
 37000K .......... .......... .......... .......... .......... 81%  466M 0s
 37050K .......... .......... .......... .......... .......... 81%  231M 0s
 37100K .......... .......... .......... .......... .......... 81%  322M 0s
 37150K .......... .......... .......... .......... .......... 81%  338M 0s
 37200K .......... .......... .......... .......... .......... 81%  431M 0s
 37250K .......... .......... .......... .......... .......... 81%  328M 0s
 37300K .......... .......... .......... .......... .......... 81%  430M 0s
 37350K .......... .......... .......... .......... .......... 81%  239M 0s
 37400K .......... .......... .......... .......... .......... 81%  326M 0s
 37450K .......... .......... .......... .......... .......... 81%  456M 0s
 37500K .......... .......... .......... .......... .......... 82%  296M 0s
 37550K .......... .......... .......... .......... .......... 82%  349M 0s
 37600K .......... .......... .......... .......... .......... 82%  253M 0s
 37650K .......... .......... .......... .......... .......... 82%  456M 0s
 37700K .......... .......... .......... .......... .......... 82%  226M 0s
 37750K .......... .......... .......... .......... .......... 82%  462M 0s
 37800K .......... .......... .......... .......... .......... 82%  465M 0s
 37850K .......... .......... .......... .......... .......... 82%  480M 0s
 37900K .......... .......... .......... .......... .......... 82%  181M 0s
 37950K .......... .......... .......... .......... .......... 83%  349M 0s
 38000K .......... .......... .......... .......... .......... 83%  255M 0s
 38050K .......... .......... .......... .......... .......... 83%  442M 0s
 38100K .......... .......... .......... .......... .......... 83%  286M 0s
 38150K .......... .......... .......... .......... .......... 83%  278M 0s
 38200K .......... .......... .......... .......... .......... 83%  324M 0s
 38250K .......... .......... .......... .......... .......... 83%  411M 0s
 38300K .......... .......... .......... .......... .......... 83%  234M 0s
 38350K .......... .......... .......... .......... .......... 83%  251M 0s
 38400K .......... .......... .......... .......... .......... 84%  415M 0s
 38450K .......... .......... .......... .......... .......... 84%  475M 0s
 38500K .......... .......... .......... .......... .......... 84%  255M 0s
 38550K .......... .......... .......... .......... .......... 84%  300M 0s
 38600K .......... .......... .......... .......... .......... 84%  297M 0s
 38650K .......... .......... .......... .......... .......... 84%  275M 0s
 38700K .......... .......... .......... .......... .......... 84%  434M 0s
 38750K .......... .......... .......... .......... .......... 84%  346M 0s
 38800K .......... .......... .......... .......... .......... 84%  265M 0s
 38850K .......... .......... .......... .......... .......... 85%  258M 0s
 38900K .......... .......... .......... .......... .......... 85%  317M 0s
 38950K .......... .......... .......... .......... .......... 85%  259M 0s
 39000K .......... .......... .......... .......... .......... 85%  415M 0s
 39050K .......... .......... .......... .......... .......... 85%  465M 0s
 39100K .......... .......... .......... .......... .......... 85%  248M 0s
 39150K .......... .......... .......... .......... .......... 85%  150M 0s
 39200K .......... .......... .......... .......... .......... 85%  222M 0s
 39250K .......... .......... .......... .......... .......... 85%  324M 0s
 39300K .......... .......... .......... .......... .......... 86%  398M 0s
 39350K .......... .......... .......... .......... .......... 86%  314M 0s
 39400K .......... .......... .......... .......... .......... 86%  310M 0s
 39450K .......... .......... .......... .......... .......... 86%  476M 0s
 39500K .......... .......... .......... .......... .......... 86%  240M 0s
 39550K .......... .......... .......... .......... .......... 86%  235M 0s
 39600K .......... .......... .......... .......... .......... 86%  322M 0s
 39650K .......... .......... .......... .......... .......... 86%  419M 0s
 39700K .......... .......... .......... .......... .......... 86%  482M 0s
 39750K .......... .......... .......... .......... .......... 87%  245M 0s
 39800K .......... .......... .......... .......... .......... 87%  304M 0s
 39850K .......... .......... .......... .......... .......... 87%  460M 0s
 39900K .......... .......... .......... .......... .......... 87%  229M 0s
 39950K .......... .......... .......... .......... .......... 87%  229M 0s
 40000K .......... .......... .......... .......... .......... 87%  321M 0s
 40050K .......... .......... .......... .......... .......... 87%  443M 0s
 40100K .......... .......... .......... .......... .......... 87%  461M 0s
 40150K .......... .......... .......... .......... .......... 87%  294M 0s
 40200K .......... .......... .......... .......... .......... 88%  415M 0s
 40250K .......... .......... .......... .......... .......... 88%  438M 0s
 40300K .......... .......... .......... .......... .......... 88%  465M 0s
 40350K .......... .......... .......... .......... .......... 88%  341M 0s
 40400K .......... .......... .......... .......... .......... 88%  460M 0s
 40450K .......... .......... .......... .......... .......... 88%  216M 0s
 40500K .......... .......... .......... .......... .......... 88%  407M 0s
2025-07-01 13:28:49 (240 MB/s) - ‘resnet18-f37072fd.pth’ saved [46830571/46830571]
Hide/Show the code
learn <- cnn_learner(dls = dls,
                     arch = resnet18(),
                     path = ".",
                     metrics = list(accuracy, error_rate))

Now we are ready to train our model. Again, for the sake of illustration, we use only 2 epochs here, but we used 20 epochs to get the full results presented in the main text. With all pictures and a resnet50, it took approximately 75 minutes per epoch on a Mac with a 2.4 GHz processor and 64 GB of memory, and less than half an hour on a machine with a GPU. On this reduced dataset, it took a bit more than a minute per epoch on the same Mac. Note that we save the model after each epoch for later use.

Hide/Show the code
one_cycle <- learn %>%
  fit_one_cycle(2, cbs = SaveModelCallback(every_epoch = TRUE,
                                           fname = 'model'))

epoch   train_loss   valid_loss   accuracy   error_rate   time  
------  -----------  -----------  ---------  -----------  ------
0       2.599615     0.903939     0.739583   0.260417     00:38 
1       1.712405     0.817533     0.770833   0.229167     00:38 
Hide/Show the code
one_cycle
  epoch train_loss valid_loss  accuracy error_rate
1     0   2.599615  0.9039392 0.7395833  0.2604167
2     1   1.712405  0.8175330 0.7708333  0.2291667

We may dig a bit deeper into training performance by loading the best model, here model_1.pth, and displaying some metrics for each species.

Hide/Show the code
learn$load("model_1")
Sequential(
  (0): Sequential(
    (0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
    (1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (2): ReLU(inplace=True)
    (3): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
    (4): Sequential(
      (0): BasicBlock(
        (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (1): BasicBlock(
        (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (5): Sequential(
      (0): BasicBlock(
        (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (downsample): Sequential(
          (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
          (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (1): BasicBlock(
        (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (6): Sequential(
      (0): BasicBlock(
        (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (downsample): Sequential(
          (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (1): BasicBlock(
        (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (7): Sequential(
      (0): BasicBlock(
        (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (downsample): Sequential(
          (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (1): BasicBlock(
        (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (relu): ReLU(inplace=True)
        (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
  )
  (1): Sequential(
    (0): AdaptiveConcatPool2d(
      (ap): AdaptiveAvgPool2d(output_size=1)
      (mp): AdaptiveMaxPool2d(output_size=1)
    )
    (1): fastai.layers.Flatten(full=False)
    (2): BatchNorm1d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (3): Dropout(p=0.25, inplace=False)
    (4): Linear(in_features=1024, out_features=512, bias=False)
    (5): ReLU(inplace=True)
    (6): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (7): Dropout(p=0.5, inplace=False)
    (8): Linear(in_features=512, out_features=12, bias=False)
  )
)
Hide/Show the code
interp <- ClassificationInterpretation_from_learner(learn)
Hide/Show the code
interp$print_classification_report()

              precision    recall  f1-score   support

   blaireaux       0.00      0.00      0.00         2
     chamois       0.00      0.00      0.00         3
    chasseur       0.50      0.25      0.33         4
        chat       0.00      0.00      0.00         2
   chevreuil       0.33      0.33      0.33         3
       chien       0.50      0.33      0.40         3
      humain       0.88      0.79      0.84        29
      lievre       0.00      0.00      0.00         1
        lynx       0.75      1.00      0.86         3
      renard       0.67      1.00      0.80        12
   sangliers       0.60      0.86      0.71         7
    vehicule       0.87      1.00      0.93        27

    accuracy                           0.77        96
   macro avg       0.43      0.46      0.43        96
weighted avg       0.71      0.77      0.73        96
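For readers less familiar with these metrics, here is a minimal base-R sketch of how precision, recall and F1-score are computed from a confusion matrix. The 2x2 counts below are made up for illustration and are not taken from our model.

```r
# Hedged sketch: precision, recall and F1 from a (made-up) confusion matrix
conf <- matrix(c(23, 4,   # predicted "lynx":  23 true lynx, 4 other species
                 6, 63),  # predicted "other": 6 true lynx, 63 other species
               nrow = 2, byrow = TRUE,
               dimnames = list(predicted = c("lynx", "other"),
                               truth     = c("lynx", "other")))
precision <- conf["lynx", "lynx"] / sum(conf["lynx", ])   # TP / (TP + FP)
recall    <- conf["lynx", "lynx"] / sum(conf[, "lynx"])   # TP / (TP + FN)
f1        <- 2 * precision * recall / (precision + recall) # harmonic mean
round(c(precision = precision, recall = recall, f1 = f1), 2)
```

The macro average in the report above is the unweighted mean of these per-class scores, while the weighted average weights each class by its support.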

We may extract the pairs of categories that are confused with each other most often.

Hide/Show the code
interp %>% most_confused()
          V1        V2 V3
1     humain  vehicule  4
2   chasseur    humain  3
3    chamois chevreuil  2
4  blaireaux    renard  1
5  blaireaux sangliers  1
6    chamois sangliers  1
7       chat      lynx  1
8       chat    renard  1
9  chevreuil    renard  1
10 chevreuil sangliers  1
11     chien    renard  1
12     chien sangliers  1
13    humain  chasseur  1
14    humain     chien  1
15    lievre    renard  1
16 sangliers    renard  1
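The column names returned by most_confused() are not very informative: each row gives the true label, the label it was mistaken for, and the number of images involved. A small base-R sketch, copying the first rows of the table above into a data frame, makes this reading explicit:

```r
# Hedged sketch: relabel the most_confused() columns for readability
# (V1 = true label, V2 = predicted label, V3 = number of images).
# The rows below copy the top of the table above.
confused <- data.frame(V1 = c("humain", "chasseur", "chamois"),
                       V2 = c("vehicule", "humain", "chevreuil"),
                       V3 = c(4L, 3L, 2L))
names(confused) <- c("truth", "prediction", "n")
confused
```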

6.3 Transferability

In this section, we show how to use our freshly trained model to label images taken at another study site in the Ain county, which were not used to train our model. First, we get the paths to the images.

Hide/Show the code
fls <- list.files(path = "pix/pixAin",
                  full.names = TRUE,
                  recursive = TRUE)

Then we carry out prediction, and compare to the truth.

Hide/Show the code
predicted <- character(length(fls)) # one prediction per image
categories <- interp$vocab %>% as.character() %>% 
  str_replace_all("[[:punct:]]", " ") %>%
  str_trim() %>%
  str_split("   ") %>%
  unlist()
for (i in seq_along(fls)){
  result <- learn %>% predict(fls[i]) # make prediction
  index <- result[[3]] %>% as.character() %>% 
    str_extract("\\d+") %>%
    as.integer() # extract the predicted class index
  predicted[i] <- categories[index + 1] # Python indices start at 0
}
data.frame(truth = c("lynx", "roe deer", "wild boar"),
           prediction = predicted) %>%
  kable() %>%
  kable_styling()
Table 6: Comparison of the predictions vs. ground truth.
truth       prediction
lynx        renard
roe deer    lynx
wild boar   sangliers
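Since the model predicts the French class labels while the ground truth above is given in English, comparing the two requires a translation step. The sketch below is our own addition (the lookup table fr_to_en is an assumption, not part of the trained model) and computes the accuracy on this tiny transfer set:

```r
# Hedged sketch: map French class labels to English, then score the
# three transfer predictions from Table 6. The fr_to_en lookup is ours.
fr_to_en <- c(lynx = "lynx", renard = "red fox",
              sangliers = "wild boar", chevreuil = "roe deer")
truth      <- c("lynx", "roe deer", "wild boar")
prediction <- unname(fr_to_en[c("renard", "lynx", "sangliers")])
mean(prediction == truth)  # 1 of 3 correct
```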

References

Allaire, JJ, Kevin Ushey, Yuan Tang, and Dirk Eddelbuettel. 2017. Reticulate: R Interface to Python. https://github.com/rstudio/reticulate.
Baraniuk, Richard, David Donoho, and Matan Gavish. 2020. “The Science of Deep Learning.” Proceedings of the National Academy of Sciences 117 (48): 30029–32. https://doi.org/10.1073/pnas.2020596117.
Beery, Sara, Grant van Horn, and Pietro Perona. 2018. “Recognition in Terra Incognita.” arXiv:1807.04975. http://arxiv.org/abs/1807.04975.
Botella, Christophe, Alexis Joly, Pierre Bonnet, Pascal Monestiez, and François Munoz. 2018. “Species Distribution Modeling Based on the Automated Identification of Citizen Observations.” Applications in Plant Sciences 6 (2): e1029.
Breitenmoser, Urs. 1998. “Large Predators in the Alps: The Fall and Rise of Man’s Competitors.” Biological Conservation, Conservation Biology and Biodiversity Strategies, 83 (3): 279–89. https://doi.org/10.1016/S0006-3207(97)00084-0.
Chambert, Thierry, Evan H. Campbell Grant, David A. W. Miller, James D. Nichols, Kevin P. Mulder, and Adrianne B. Brand. 2018. “Two-Species Occupancy Modelling Accounting for Species Misidentification and Non-Detection.” Methods in Ecology and Evolution 9 (6): 1468–77. https://doi.org/10.1111/2041-210X.12985.
Christin, Sylvain, Éric Hervet, and Nicolas Lecomte. 2019. “Applications for Deep Learning in Ecology.” Edited by Hao Ye. Methods in Ecology and Evolution 10 (10): 1632–44. https://doi.org/10.1111/2041-210X.13256.
Clipp, Hannah L., Amber L. Evans, Brin E. Kessinger, K. Kellner, and Christopher T. Rota. 2021. “A Penalized Likelihood for Multi-Species Occupancy Models Improves Predictions of Species Interactions.” Ecology.
Dai, Bin, Shilin Ding, and Grace Wahba. 2013. “Multivariate Bernoulli Distribution.” Bernoulli 19 (4). https://doi.org/10.3150/12-BEJSP10.
Duggan, Matthew T., Melissa F. Groleau, Ethan P. Shealy, Lillian S. Self, Taylor E. Utter, Matthew M. Waller, Bryan C. Hall, Chris G. Stone, Layne L. Anderson, and Timothy A. Mousseau. 2021. “An Approach to Rapid Processing of Camera Trap Images with Minimal Human Input.” Ecology and Evolution. https://doi.org/10.1002/ece3.7970.
Fiske, Ian, and Richard Chandler. 2011. unmarked: An R Package for Fitting Hierarchical Models of Wildlife Occurrence and Abundance.” Journal of Statistical Software 43 (10): 1–23. https://www.jstatsoft.org/v43/i10/.
Gimenez, Olivier, Stephen T. Buckland, Byron J. T. Morgan, Nicolas Bez, Sophie Bertrand, Rémi Choquet, Stéphane Dray, et al. 2014. “Statistical Ecology Comes of Age.” Biology Letters 10 (12): 20140698. https://doi.org/10.1098/rsbl.2014.0698.
He, Kaiming, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2016. “Deep Residual Learning for Image Recognition.” In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 770–78. https://doi.org/10.1109/CVPR.2016.90.
Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. 2012. “ImageNet Classification with Deep Convolutional Neural Networks.” In Advances in Neural Information Processing Systems 25, edited by F. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, 1097–1105. Curran Associates, Inc.
Lahoz-Monfort, José J, and Michael J L Magrath. 2021. “A Comprehensive Overview of Technologies for Species and Habitat Monitoring and Conservation.” BioScience. https://doi.org/10.1093/biosci/biab073.
Lai, Jiangshan, Christopher J. Lortie, Robert A. Muenchen, Jian Yang, and Keping Ma. 2019. “Evaluating the Popularity of R in Ecology.” Ecosphere 10 (1). https://doi.org/10.1002/ecs2.2567.
Lamba, Aakash, Phillip Cassey, Ramesh Raja Segaran, and Lian Pin Koh. 2019. “Deep Learning for Environmental Conservation.” Current Biology 29 (19): R977–82. https://doi.org/10.1016/j.cub.2019.08.016.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015. “Deep Learning.” Nature 521 (7553): 436–44. https://doi.org/10.1038/nature14539.
Miller, David A., James D. Nichols, Brett T. McClintock, Evan H. Campbell Grant, Larissa L. Bailey, and Linda A. Weir. 2011. “Improving Occupancy Estimation When Two Types of Observational Error Occur: Non-Detection and Species Misidentification.” Ecology 92 (7): 1422–28. https://doi.org/10.1890/10-1396.1.
Molinari-Jobin, Anja, Fridolin Zimmermann, Andreas Ryser, Christine Breitenmoser-Würsten, Simon Capt, Urs Breitenmoser, Paolo Molinari, Heinrich Haller, and Roman Eyholzer. 2007. “Variation in Diet, Prey Selectivity and Home-Range Size of Eurasian Lynx Lynx Lynx in Switzerland.” Wildlife Biology 13 (4): 393–405. https://doi.org/10.2981/0909-6396(2007)13[393:VIDPSA]2.0.CO;2.
Norouzzadeh, Mohammad Sadegh, Dan Morris, Sara Beery, Neel Joshi, Nebojsa Jojic, and Jeff Clune. 2021. “A Deep Active Learning System for Species Identification and Counting in Camera Trap Images.” Edited by Matthew Schofield. Methods in Ecology and Evolution 12 (1): 150–61. https://doi.org/10.1111/2041-210X.13504.
Paszke, Adam, Sam Gross, Francisco Massa, Adam Lerer, James Bradbury, Gregory Chanan, Trevor Killeen, et al. 2019. “PyTorch: An Imperative Style, High-Performance Deep Learning Library.” In Advances in Neural Information Processing Systems 32, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. dAlché-Buc, E. Fox, and R. Garnett, 8024–35. Curran Associates, Inc. http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf.
Rota, Christopher T., Marco A. R. Ferreira, Roland W. Kays, Tavis D. Forrester, Elizabeth L. Kalies, William J. McShea, Arielle W. Parsons, and Joshua J. Millspaugh. 2016. “A Multispecies Occupancy Model for Two or More Interacting Species.” Methods in Ecology and Evolution 7 (10): 1164–73. https://doi.org/10.1111/2041-210X.12587.
Shao, Ling, Fan Zhu, and Xuelong Li. 2015. “Transfer Learning for Visual Categorization: A Survey.” IEEE Transactions on Neural Networks and Learning Systems 26 (5): 1019–34. https://doi.org/10.1109/TNNLS.2014.2330900.
Shorten, Connor, and Taghi M. Khoshgoftaar. 2019. “A Survey on Image Data Augmentation for Deep Learning.” Journal of Big Data 6 (1): 60. https://doi.org/10.1186/s40537-019-0197-0.
Strubell, Emma, Ananya Ganesh, and Andrew McCallum. 2019. “Energy and Policy Considerations for Deep Learning in NLP.” arXiv:1906.02243. http://arxiv.org/abs/1906.02243.
Sutherland, William J., Robert P. Freckleton, H. Charles J. Godfray, Steven R. Beissinger, Tim Benton, Duncan D. Cameron, Yohay Carmel, et al. 2013. “Identification of 100 Fundamental Ecological Questions.” Edited by David Gibson. Journal of Ecology 101 (1): 58–67. https://doi.org/10.1111/1365-2745.12025.
Tabak, Michael A., Mohammad S. Norouzzadeh, David W. Wolfson, Erica J. Newton, Raoul K. Boughton, Jacob S. Ivan, Eric A. Odell, et al. 2020. “Improving the Accessibility and Transferability of Machine Learning Algorithms for Identification of Animals in Camera Trap Images: MLWIC2.” Ecology and Evolution 10 (19): 10374–83. https://doi.org/10.1002/ece3.6692.
Tabak, Michael A., Mohammad S. Norouzzadeh, David W. Wolfson, Steven J. Sweeney, Kurt C. Vercauteren, Nathan P. Snow, Joseph M. Halseth, et al. 2019. “Machine Learning to Classify Animal Species in Camera Trap Images: Applications in Ecology.” Edited by Theoni Photopoulou. Methods in Ecology and Evolution 10 (4): 585–90. https://doi.org/10.1111/2041-210X.13120.
Vandel, Jean-Michel, and Philippe Stahl. 2005. “Distribution Trend of the Eurasian Lynx Lynx Lynx Populations in France.” Mammalia 69 (2). https://doi.org/10.1515/mamm.2005.013.
Voulodimos, Athanasios, Nikolaos Doulamis, Anastasios Doulamis, and Eftychios Protopapadakis. 2018. “Deep Learning for Computer Vision: A Brief Review.” Edited by Diego Andina. Computational Intelligence and Neuroscience 2018 (February): 7068349. https://doi.org/10.1155/2018/7068349.
Wearn, Oliver R., Robin Freeman, and David M. P. Jacoby. 2019. “Responsible AI for Conservation.” Nature Machine Intelligence 1 (2): 72–73. https://doi.org/10.1038/s42256-019-0022-7.
Weinstein, Ben G. 2018. “A Computer Vision for Animal Ecology.” Edited by Laura Prugh. Journal of Animal Ecology 87 (3): 533–45. https://doi.org/10.1111/1365-2656.12780.
Willi, Marco, Ross T. Pitman, Anabelle W. Cardoso, Christina Locke, Alexandra Swanson, Amy Boyer, Marten Veldthuis, and Lucy Fortson. 2019. “Identifying Animal Species in Camera Trap Images Using Deep Learning and Citizen Science.” Edited by Oscar Gaggiotti. Methods in Ecology and Evolution 10 (1): 80–91. https://doi.org/10.1111/2041-210X.13099.
Yosinski, Jason, Jeff Clune, Yoshua Bengio, and Hod Lipson. 2014. “How Transferable Are Features in Deep Neural Networks?” In Proceedings of the 27th International Conference on Neural Information Processing Systems - Volume 2, 3320–28. NIPS’14. Cambridge, MA, USA: MIT Press.
Zimmermann, Fridolin, Christine Breitenmoser-Würsten, Anja Molinari-Jobin, and Urs Breitenmoser. 2013. “Optimizing the Size of the Area Surveyed for Monitoring a Eurasian Lynx (Lynx Lynx) Population in the Swiss Alps by Means of Photographic Capture-Recapture.” Integrative Zoology 8 (3): 232–43. https://doi.org/10.1111/1749-4877.12017.

Session information

R version 4.5.0 (2025-04-11)
Platform: x86_64-pc-linux-gnu
Running under: Ubuntu 24.04.2 LTS

Matrix products: default
BLAS:   /usr/lib/x86_64-linux-gnu/blas/libblas.so.3.12.0 
LAPACK: /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3.12.0  LAPACK version 3.12.0

locale:
 [1] LC_CTYPE=C.UTF-8       LC_NUMERIC=C           LC_TIME=C.UTF-8       
 [4] LC_COLLATE=C.UTF-8     LC_MONETARY=C.UTF-8    LC_MESSAGES=C.UTF-8   
 [7] LC_PAPER=C.UTF-8       LC_NAME=C              LC_ADDRESS=C          
[10] LC_TELEPHONE=C         LC_MEASUREMENT=C.UTF-8 LC_IDENTIFICATION=C   

time zone: Etc/UTC
tzcode source: system (glibc)

attached base packages:
[1] stats     graphics  grDevices datasets  utils     methods   base     

other attached packages:
 [1] reticulate_1.42.0 exifr_0.3.2       unmarked_1.5.0    cvms_1.7.0       
 [5] janitor_2.2.1     highcharter_0.9.4 fastai_2.2.2      ggtext_0.1.2     
 [9] wesanderson_0.3.7 kableExtra_1.4.0  stringi_1.8.7     cowplot_1.1.3    
[13] sf_1.0-21         lubridate_1.9.4   forcats_1.0.0     stringr_1.5.1    
[17] dplyr_1.1.4       purrr_1.0.4       readr_2.1.5       tidyr_1.3.1      
[21] tibble_3.3.0      ggplot2_3.5.2     tidyverse_2.0.0  

loaded via a namespace (and not attached):
 [1] Rdpack_2.6.4       DBI_1.2.3          rlang_1.1.6        magrittr_2.0.3    
 [5] snakecase_0.11.1   e1071_1.7-16       compiler_4.5.0     png_0.1-8         
 [9] systemfonts_1.2.3  vctrs_0.6.5        pkgconfig_2.0.3    crayon_1.5.3      
[13] fastmap_1.2.0      backports_1.5.0    labeling_0.4.3     rmarkdown_2.29    
[17] markdown_2.0       tzdb_0.5.0         ragg_1.4.0         bit_4.6.0         
[21] xfun_0.52          litedown_0.7       jsonlite_2.0.0     jpeg_0.1-11       
[25] broom_1.0.8        parallel_4.5.0     R6_2.6.1           RColorBrewer_1.1-3
[29] rlist_0.4.6.2      car_3.1-3          Rcpp_1.0.14        assertthat_0.2.1  
[33] knitr_1.50         zoo_1.8-14         Matrix_1.7-3       igraph_2.1.4      
[37] timechange_0.3.0   tidyselect_1.2.1   abind_1.4-8        rstudioapi_0.17.1 
[41] yaml_2.3.10        curl_6.4.0         lattice_0.22-5     plyr_1.8.9        
[45] quantmod_0.4.28    withr_3.0.2        evaluate_1.0.4     units_0.8-7       
[49] proxy_0.4-27       xts_0.14.1         xml2_1.3.8         ggpubr_0.6.1      
[53] pillar_1.10.2      carData_3.0-5      KernSmooth_2.23-26 checkmate_2.3.2   
[57] renv_1.1.4         reformulas_0.4.1   generics_0.1.4     TTR_0.24.4        
[61] vroom_1.6.5        hms_1.1.3          commonmark_1.9.5   scales_1.4.0      
[65] class_7.3-23       glue_1.8.0         tools_4.5.0        data.table_1.17.6 
[69] ggsignif_0.6.4     grid_4.5.0         rbibutils_2.3      Formula_1.2-5     
[73] cli_3.6.5          rappdirs_0.3.3     textshaping_1.0.1  viridisLite_0.4.2 
[77] svglite_2.2.1      gtable_0.3.6       rstatix_0.7.2      digest_0.6.37     
[81] classInt_0.4-11    htmlwidgets_1.6.4  farver_2.1.2       htmltools_0.5.8.1 
[85] lifecycle_1.0.4    gridtext_0.1.5     bit64_4.6.0-1      MASS_7.3-65       

Acknowledgments

We warmly thank Mathieu Massaviol, Remy Dernat and Khalid Belkhir for their help in using GPU machines on the Montpellier Bioinformatics Biodiversity platform, Julien Renoult for helpful discussions, Delphine Dinouart and Chloé Quillard for their precious help in manually tagging the images, and Vincent Miele for having inspired this work, and his help and support along the way. We also thank the staff of the Federations of Hunters from the Jura and Ain counties, hunters who helped to find locations for camera traps and volunteers who contributed to collecting data. Our thanks also go to Hannah Clipp, Chris Rota and Ken Kellner for sharing a development version of unmarked, and an unpublished version of their paper. The Lynx Predator Prey Program was funded by Auvergne-Rhône-Alpes Region, Ain and Jura departmental Councils, The French National Federation of Hunters, French Environmental Ministry based in Auvergne-Rhône-Alpes and Bourgogne Franche-Comté Region and the French Office for Biodiversity. Our work was also partly funded by the French National Research Agency (grant ANR-16-CE02-0007).

Reuse

Citation

BibTeX citation:
@article{gimenez2022,
  author = {Gimenez, Olivier and Kervellec, Maëlis and Fanjul,
    Jean-Baptiste and Chaine, Anna and Marescot, Lucile and Bollet,
    Yoann and Duchamp, Christophe},
  publisher = {French Statistical Society},
  title = {Trade-Off Between Deep Learning for Species Identification
    and Inference about Predator-Prey Co-Occurrence},
  journal = {Computo},
  date = {2022-04-22},
  doi = {10.57750/yfm2-5f45},
  issn = {2824-7795},
  langid = {en},
  abstract = {Deep learning is used in computer vision problems with
    important applications in several scientific fields. In ecology for
    example, there is a growing interest in deep learning for
    automatizing repetitive analyses on large amounts of images, such as
    animal species identification. However, there are challenging issues
    toward the wide adoption of deep learning by the community of
    ecologists. First, there is a programming barrier as most algorithms
    are written in `Python` while most ecologists are versed in `R`.
    Second, recent applications of deep learning in ecology have focused
    on computational aspects and simple tasks without addressing the
    underlying ecological questions or carrying out the statistical data
    analysis to answer these questions. Here, we showcase a reproducible
    `R` workflow integrating both deep learning and statistical models
    using predator-prey relationships as a case study. We illustrate
    deep learning for the identification of animal species on images
    collected with camera traps, and quantify spatial co-occurrence
    using multispecies occupancy models. Despite average model
    classification performances, ecological inference was similar
    whether we analysed the ground truth dataset or the classified
    dataset. This result calls for further work on the trade-offs
    between time and resources allocated to train models with deep
    learning and our ability to properly address key ecological
    questions with biodiversity monitoring. We hope that our
    reproducible workflow will be useful to ecologists and applied
    statisticians.}
}
For attribution, please cite this work as:
Gimenez, Olivier, Maëlis Kervellec, Jean-Baptiste Fanjul, Anna Chaine, Lucile Marescot, Yoann Bollet, and Christophe Duchamp. 2022. “Trade-Off Between Deep Learning for Species Identification and Inference about Predator-Prey Co-Occurrence.” Computo, April. https://doi.org/10.57750/yfm2-5f45.