Radiation Measurements 151 (2022) 106706
Image sorting of nuclear reactions recorded on CR-39 nuclear track detector
using deep learning
Ken Tashiro a,*, Kazuki Noto a, Quazi Muhammad Rashed Nizam b, Eric Benton c, Nakahiro Yasuda a

a Research Institute of Nuclear Engineering, University of Fukui, Tsuruga, Fukui, Japan
b Department of Physics, University of Chittagong, Chittagong, Bangladesh
c Department of Physics, Oklahoma State University, Stillwater, OK, USA
* Corresponding author. Research Institute of Nuclear Engineering, University of Fukui, 1-3-33 Kanawa, 914-0055, Tsuruga, Fukui, Japan.
E-mail address: (K. Tashiro).
Received 10 September 2021; Received in revised form 15 January 2022; Accepted 19 January 2022; Available online 22 January 2022.
1350-4487/© 2022 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license.
A R T I C L E  I N F O
Keywords:
CR-39 nuclear track detector
Deep learning
Object detection
Image merging
Total charge changing cross-section
A B S T R A C T
Deep learning has been utilized to trace nuclear reactions in the CR-39 nuclear track detector. Etch pit images on the front and back surfaces of the CR-39 detector were obtained sequentially by moving the objective lens of a microscope and were merged into a single image. This image merging combines, in one image, information on the displacement of the positions of the etch pits produced by single particle traversals through a CR-39 layer, thereby making it easier to recognize the corresponding nuclear fragmentation reactions. Object detection based on deep learning has been applied to the merged images to identify nuclear fragmentation events for measurement of the total charge changing cross-section, which is based on the number of incident particles (Nin) and the number of particles that passed through the target without any nuclear reaction (Nout). We verified the accuracy (correct answer rate) of the algorithms for extracting the two patterns of etch pits in merged images which correspond to Nin and Nout, using learning curves expressed as a function of the number of trainings. The accuracies for Nin and Nout were found to be 97.3 ± 4.0% and 98.0 ± 4.0%, respectively. These results show that object detection based on deep learning can be a strong tool for CR-39 etch pit analysis.
1. Introduction
CR-39 solid-state nuclear track detector has been a powerful tool to
measure the total charge changing cross-section (Golovchenko et al.,
2001, 2002; Cecchini et al., 2008; Duan et al., 2021; Huo et al., 2019;
Zheng et al., 2021) and fragment emission angles (Giacomelli et al.,
2004; Sihver et al., 2013; Zhang et al., 2018), since it has high charge
resolution (Ota et al., 2011) and has the potential to accurately measure
fragment emission angles (Rashed-Nizam et al., 2020). In this experimental application, the CR-39 nuclear track detector is frequently used not only as a detector but also as a target (the material whose cross-sections are to be measured), since etch pits appear along the ion track on both the front and back surfaces after chemical etching. The front and back surface images are independently captured by a microscope. These images are analyzed to extract the positions and sizes of the etch pits in order to trace a charged particle's trajectory in the target (Yasuda et al., 2005, 2009). By
matching the positions of the etch pits in the independently obtained images of the front and back surfaces of the detector, it has been possible to identify particles that have passed through the target or have undergone a nuclear reaction within the target (Skvarč and Golovchenko, 2001; Ota et al., 2008). In order to identify a nuclear reaction, it is necessary to establish a one-to-one correspondence between the etch pits on the detector's front and back surfaces. The matching method requires accurate alignment of the etch pits on both surfaces, and the alignment accuracy is estimated to be 2–3 μm (Ota et al., 2008). The accuracy of this alignment puts a limit on the matching method. Recently, we have developed a technique that takes images of the front and back surfaces of the CR-39 detector sequentially by moving the objective lens of a microscope, without any treatment for the alignment of images (Rashed-Nizam et al., 2020). With this technique, the alignment (matching) error is due only to the verticality of the Z-axis movement of the microscope, and the accuracy is within 1 pixel (0.24 μm in this case). As a feasibility study, we applied object detection based on deep learning, which can simultaneously classify nuclear reaction images and detect object positions.
Object detection is a computer technology that determines whether
objects of a given class (such as humans, cars, or buildings) are present in digital images and movies. When such objects are present, it returns the spatial location and size of each object as a result (Liu et al., 2020). This
technology has been researched based on human-designed features in
the field of computer vision for the development of technologies such as
face recognition (Viola and Jones, 2004). The advent of deep learning
techniques, a method of automatically learning features from data, has
improved object detection technology in various research fields (LeCun
et al., 2015). Performance of object detection is improving annually by
incorporating deep learning technology (Liu et al., 2020; Zou et al.,
2019).
Recent studies in the field of radiation measurement have also applied deep learning technology, for example a new method of visualizing the ambient dose rate distribution from airborne radiation monitoring results using artificial neural networks (Sasaki et al., 2021). Methods have also been developed to analyze radon time-series sampling data by machine learning and to relate them to environmental factors (Janik et al., 2018; Hosoda et al., 2020). For detectors that require image analysis, such as the nuclear emulsion and the
fluorescent nuclear track detector (FNTD), analysis methods based on
image classification using deep learning have also been developed. For
nuclear emulsion, an efficient classifier was developed that sorts
alpha-decay events from various vertex-like objects in an emulsion using
a convolutional neural network (Yoshida et al., 2021). For FNTD, an
image processing technique involving convolutional neural networks
has been demonstrated for neutron dosimetry applications (Akselrod
et al., 2020).
In this study, we have developed a new methodology for tracing ion
track penetration by merging images on both sides of a CR-39 detector
without relying on pattern matching. Instead, object detection based on
deep learning is applied to the etch pit analysis.
2. Materials and methods

2.1. Experimental

We used the CR-39 detector (HARZLAS TD-1) manufactured by Fukuvi Chemical Industry Co., Ltd. Layers of the CR-39 detector (0.45 mm thick) were cut into 50 mm × 50 mm squares and exposed to a 55 MeV/nucleon 12C beam with a particle density of 2500 ions/cm2 at the Wakasa Wan Energy Research Center (WERC) (Hatori et al., 2001). After irradiation, the detector was etched in 7 N NaOH solution at 70 °C for 20 h. After chemical etching, images of the front and back surfaces of the CR-39 detector were acquired using an FSP-1000 imaging microscope manufactured by SEIKO Time Creation Inc. The autofocus system of the microscope was used to capture images of the front and back surfaces. After capturing an image of the front surface, the objective lens moves to a lower depth (Z-axis of the microscope system) of the CR-39 detector, and the back surface image is captured for the same field of view (Rashed-Nizam et al., 2020). The images of both surfaces (2500 pixels × 1800 pixels) were obtained using a 20× magnification objective lens with a pixel size of 0.24 μm × 0.24 μm. These images are represented by values from black (0) to white (255) in grayscale images with 256 gray levels.

Fig. 1. Images of the front (a) and back (b) surface were merged into the merged image (c) by image subtraction, after adding 200 (gray levels) to each pixel value of the back image. In the merged image (c), white circles represent the etch pits on the front surface and black circles represent the etch pits on the back surface.
2.2. Image merging of front and back surfaces on CR-39 detector
We employed image merging, a method analogous to background subtraction, in which a moving object is detected by comparing an observed image with a background image. As shown in Fig. 1, front (a) and back (b) images of the CR-39 detector were acquired from the microscope. By adding 200 (gray levels) to each pixel value of the back image and then subtracting each pixel value of the front image, we created the merged image (c).
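Since the merge is simple pixel arithmetic, it can be reproduced in a few lines. Below is a minimal sketch in Python with OpenCV, assuming 8-bit grayscale images of the same field of view in which etch pits image dark against a bright background; the file names are illustrative.

import cv2
import numpy as np

# Front and back images of the same field of view, as 8-bit grayscale.
# Widen the type so the intermediate sum/difference is not clipped.
front = cv2.imread("front.png", cv2.IMREAD_GRAYSCALE).astype(np.int16)
back = cv2.imread("back.png", cv2.IMREAD_GRAYSCALE).astype(np.int16)

# Merged image: (back + 200) - front, clipped back to the 0-255 range.
# For dark pits on a bright background, front-surface pits come out as
# white circles and back-surface pits as black circles.
merged = np.clip(back + 200 - front, 0, 255).astype(np.uint8)
cv2.imwrite("merged.png", merged)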
In the merged image, white and black circles represent the etch pits on the front and back surfaces, respectively. The displacement between the positions of the white and black etch pits indicates that the ion penetrated the CR-39 detector at a small angle. It is therefore easy to discriminate the corresponding etch pits formed on the front and back surfaces by the passage of an incident ion, without any pattern matching treatment for the alignment between the two surfaces. This method can produce incident angle information from the displacement together with the distance (thickness) between the front and back surfaces, as described elsewhere (Rashed-Nizam et al., 2020), and can also indicate the presence or absence of nuclear reactions in a single image.
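For illustration, the incident angle follows from the measured displacement and the layer thickness by simple trigonometry. A minimal sketch with an assumed displacement (the 0.24 μm pixel size and the nominal 0.45 mm thickness are taken from Section 2.1):

import math

pixel_size = 0.24      # [um/pixel], from Section 2.1
displacement_px = 150  # W-to-B center displacement in pixels, illustrative
thickness = 450.0      # nominal CR-39 layer thickness [um]

d = displacement_px * pixel_size               # displacement [um]
theta = math.degrees(math.atan(d / thickness))  # incident angle
print(f"incident angle = {theta:.1f} deg")      # ~4.6 deg for these numbers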
Fig. 2. Examples of etch pits in the merged image: (a) the projectile passed through the CR-39 detector without any reaction, producing two etch pits on both surfaces (white and black); (b) the projectile decays into several lighter fragments, and these fragments are not detected due to the detection threshold; (c) nuclear fragments are observed as three tracks, indicated by white arrows.

Fig. 2 shows examples of etch pits in the merged image. Track events are classified into three categories: (a) the projectile passed through CR-
39 detector without producing any nuclear reaction; (b) no etch pits are observed on the back surface: one possible reaction is C→6p+6n, where the protons and neutrons escape detection due to the detection threshold (Yasuda et al., 2008; Kodaira et al., 2016); (c) three etch pits are observed on the back surface and are assumed to be the result of a reaction, e.g., C→3α, where the α-particles have sufficient energy loss to be detected.
Here, the total charge changing cross-section (σTCC) expresses the probability that the projectile changes its charge due to the nuclear interaction between the projectile and the target. σTCC is dominated by the total cross-section σT and is defined as (Golovchenko et al., 2002)

σTCC = σT − σel − σnr,    (1)

where σel is the elastic cross-section and σnr is the neutron removal cross-section. Using measurable quantities, σTCC can also be expressed as

σTCC = −(M / (ρ NA X)) ln(Nout / Nin),    (2)

where NA, ρ, X and M indicate Avogadro's number, the density and thickness of the target, and its atomic or molecular mass, respectively (Cecchini et al., 2008; Huo et al., 2019). Nin is the number of incident particles and Nout is the number of particles that have passed through the target without undergoing any reaction. In short, σTCC is determined by the ratio of the number of ions that passed through the detector without any nuclear reaction to the number of incident ions entering the CR-39 detector, Nout/Nin. Thus, the number of white etch pits (Nin) as in Fig. 2(b) and the number of black etch pits (Nout) as in Fig. 2(a) are essential in determining the total charge changing cross-section.
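To make the quantities concrete, the following short sketch evaluates Eq. (2) numerically. The CR-39 material constants (the molecular formula is taken here as C12H18O7) and the counts are illustrative assumptions, not values measured in this work.

import math

N_A = 6.022e23   # Avogadro's number [1/mol]
M = 274.3        # CR-39 (C12H18O7) molecular mass [g/mol], assumed
rho = 1.32       # CR-39 density [g/cm^3], assumed
X = 0.045        # target thickness [cm], one 0.45 mm layer

N_in = 50000     # incident ions (white etch pits), illustrative
N_out = 49990    # ions with no charge-changing reaction, illustrative

# Eq. (2): sigma_TCC = -(M / (rho * N_A * X)) * ln(N_out / N_in)
sigma_tcc = -(M / (rho * N_A * X)) * math.log(N_out / N_in)
print(f"sigma_TCC = {sigma_tcc / 1e-24:.2f} b")  # ~1.53 barn for these counts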
2.3. Object detection for etch pit image

To verify the quantities Nout and Nin from etch pit images, a validation dataset and two training datasets were created from the merged images. The validation dataset consists of 256 merged images (2500 pixels × 1800 pixels) and includes the white and black etch pits (W/B objects), similar to Fig. 2(a), which were visually counted to be 1227 in those 256 images. This dataset also includes 0 objects for the Fig. 2(b)-type images and 2 objects for the Fig. 2(c)-type images. The validation dataset then consists of 1229 white etch pit objects (1227 W/B and 2 W objects) as described above.

The training datasets (W/B and W) were prepared from merged images other than those of the validation dataset. The two training datasets consisted of 1200 white and black etch pit images (W/B images) and white etch pit images (W images), respectively. W/B images (416 pixels × 416 pixels), as shown in Fig. 2(a), contain various patterns
based on the differences of the position between the etch pits on both
surfaces and the distance between their centers. The W images contain
white etch pit images, as shown in Fig. 2(b). Thus, the two training datasets were used separately to train the object detection algorithms for counting Nout and Nin, and the validation dataset was used to verify them.
For object detection, we adopted YOLOv3 (Redmon and Farhadi, 2018), an object detection algorithm based on convolutional neural networks, using Python 3.7.11, the machine learning framework "Darknet", and OpenCV 4.1.2 in the "Google Colaboratory" execution environment (Bisong, 2019). The object detection algorithms were trained by inputting the training dataset (W/B) to extract the W/B objects from the validation dataset. Individual algorithms were prepared with the number of training datasets varied from N = 100 to 1200.
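For orientation, the counting step with a trained YOLOv3 model could look like the following sketch, written in Python with OpenCV's DNN module. The configuration and weight file names are hypothetical, the 416 × 416 input size matches the training image size above, and the thresholds are typical defaults; this is an illustration, not the authors' exact pipeline.

import glob

import cv2
import numpy as np

# Load a trained YOLOv3 network (file names are hypothetical).
net = cv2.dnn.readNetFromDarknet("yolov3_wb.cfg", "yolov3_wb.weights")
out_names = net.getUnconnectedOutLayersNames()

def count_objects(image_path, conf_thr=0.5, nms_thr=0.4):
    """Count etch pit objects detected by the network in one merged image."""
    img = cv2.imread(image_path)
    h, w = img.shape[:2]
    blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for output in net.forward(out_names):
        for det in output:  # det = [cx, cy, bw, bh, objectness, class scores...]
            score = float(det[4] * det[5:].max())
            if score > conf_thr:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                scores.append(score)
    # Non-maximum suppression removes duplicate boxes around the same pit.
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thr, nms_thr)
    return len(keep)

# Total detections over the validation images, for the numerator of Eq. (3).
n_detected = sum(count_objects(p) for p in glob.glob("validation/*.png"))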
We applied these algorithms to the validation dataset and counted the number of W/B objects detected by each algorithm. Accuracy was defined as the ratio of the number of W/B objects detected by the algorithm to the number of W/B objects in the validation dataset (1227), as given by equation (3):
Accuracy [%] = (The number of objects detected by the algorithm / The number of objects in the validation dataset) × 100    (3)
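As a quick check of Eq. (3) with the W/B numbers reported in Section 3.1 (1203 objects detected out of 1227 after 1000 trainings):

detected, total = 1203, 1227   # detected W/B objects vs. visually counted objects
print(f"Accuracy = {detected / total * 100:.1f} %")   # 98.0 %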
In a similar manner to the verification of Nout, the accuracy of Nin was
also verified by the algorithms using the training dataset (W) and the
validation dataset.
Fig. 3. The learning curves of the (a) W/B and (b) W object extraction algorithms. The accuracies (in %) of these algorithms are shown as a function of the number of training datasets.
3. Results and discussions
3.1. Object detection accuracy and error estimation of image sorting
To evaluate the algorithms, we employed a learning curve, which shows the predictive accuracy on test examples as a function of the number of training examples (Perlich, 2011). Fig. 3(a) shows the learning curve for the W/B object extraction algorithm, with the accuracy (in %) plotted as a function of the number of training datasets. The accuracy increased as the number of training datasets increased, reaching a maximum of 98.0 ± 4.0%, calculated from the number of detected W/B objects (1203) and the number of W/B objects in the validation dataset (1227), after 1000 trainings. Fig. 3(b) shows the learning curve of the W object extraction algorithm. The accuracy improved as the number of training datasets increased, reaching 97.3 ± 4.0%, calculated from the number of detected W objects (1196) and the number of W objects in the validation dataset (1229), at 1000 trainings. The errors in the accuracies are statistical errors calculated from the ratio of the number of W/B (W) objects detected by the algorithms to the number of W/B (W) objects in the validation dataset. The accuracies shown in Fig. 3(a) and (b) saturated
at 97–98%. The accuracies also repeatedly rose and fell as the number of training datasets increased. This phenomenon, often observed in deep learning, is called overfitting (overtraining) and is a fundamental problem in applying deep learning (Salman and Liu, 2019; Ying, 2019). It may indicate that the algorithm was optimized only for the training dataset and that this optimization did not generalize to the validation dataset. Various approaches have been proposed to reduce this effect, such as changing the neural network architecture of the algorithm and expanding the training dataset to include more highly-varied images (Ying, 2019). These approaches are expected to improve accuracy in the future.
The maximum W/B and W object detection accuracies were found to be 98.0 ± 4.0% and 97.3 ± 4.0% at N = 1000, respectively. The errors in the accuracies are statistical errors individually calculated from the ratio of the number of W/B (W) objects detected by the algorithms to the constant numbers of 1227 W/B and 1229 W objects in the validation dataset. As a result, the statistical errors in these figures vary between 3.5% and 4.0%. These statistical errors can be improved by increasing the size of the validation dataset at suitable numbers of trainings. On the other hand, a systematic error arises because different learning algorithms are created depending on how the training dataset is selected. Here, we evaluated how much the results vary when 1000 training images are randomly selected from the 1200 available. For each of the training datasets (W/B and W), a training subset was extracted, the algorithm was created, and only then was it applied to the validation dataset. This was repeated ten times to determine the accuracies and to evaluate the standard deviation of the systematic errors for both the W/B and W datasets. The systematic errors (1σ) were estimated to be 0.6% and 0.7% for W/B and W objects, respectively. The contribution of these systematic errors to the charge changing cross-section is expressed as follows:
ΔσTCC(sys) = (M / (ρ NA X)) √[(ΔNin(sys) / Nin)² + (ΔNout(sys) / Nout)²],    (4)
where ΔNin(sys) = 0.007 × Nin and ΔNout(sys) = 0.006 × Nout. As a result, the systematic error of the charge changing cross-section can be calculated as the quadrature sum of these errors, resulting in ±0.9% (1σ). It should be pointed out that these statistical and systematic errors can be affected by the etch pit density and the etching conditions (the size of the etch pits), and it is necessary to optimize the algorithm for each set of conditions.

Fig. 4. Four types of undetected objects: (a) the W/B etch pits are close to each other; (b) the distance between the two etch pits is greater than expected due to multiple Coulomb scattering; (c) multiple W/B objects are overlapping; and (d) the W/B object is located at the edge of the image.

3.2. Classification of undetected objects for further improvements

The characteristics of the etch pits that could not be detected were classified for further improvement of the accuracy. Undetected objects (2% of all W/B objects) were classified into four types, as shown in Fig. 4. (a) The W/B etch pits are close to each other and might be recognized as W objects, since the distance between the centers of the two etch pits is shorter than the radii of the individual etch pits. The
center-to-center distance was 2.7 μm, while the radius of the white etch pit was 13.4 μm in this case. (b) The distance between the two etch pits is greater than expected, as a result of multiple Coulomb scattering (Highland, 1975; Beringer et al., 2012). In this case, W/B object detection succeeded, but the pair was recognized as a chance coincidence of irrelevant etch pits, since the distance calculated from scattering was 38.3 μm while the center-to-center distance was 43.8 μm. (c) Multiple W/B objects are overlapping. This is an essential limitation of the CR-39 detection technique and should be improved by reducing the exposure density and/or by shortening the etching time to avoid overlapping etch pits. (d) The W/B object is located at the edge of the image and also cannot be processed by pattern matching. It is necessary to take measures to reduce the relative number of objects located at the image edges by increasing the validation image size. Cases (c) and (d) require additional processing in order to be included in the cross-section measurement.
Fig. 5. Examples of undetected W objects. Etch pits due to α-particles from the environment (a), and dust or tiny scratches on the surface of the CR-39 (b), indicated by arrows, are imaged near the etch pit.
For W objects, on the other hand, undetected objects (2.7% of all W objects) were classified into two types, as shown in Fig. 5(a) and (b). Etch pits due to α-particles from the environment (a) and dust or tiny scratches on the surface of the CR-39 (b) are imaged near the detection target (W object). Improvements such as shortening the exposure time to the environment and handling the detector without damaging its surface can be considered. As a further use of deep learning, in addition to the algorithm for extracting etch pits, it is possible to create an algorithm to distinguish etch pits from noise.

The conventional pattern matching method requires the measurement of etch pits from images obtained from both surfaces of the CR-39 detector and execution of the pattern matching algorithm within the measurement accuracy of 2–3 μm (Ota et al., 2008). In the new method, the alignment (matching) error is negligible, as described in Section 1, so that only multiple Coulomb scattering needs to be considered. The presence or absence of a nuclear reaction can also be determined with high accuracy by using an image in which the front and back surfaces are merged
and performing object detection on it. As a result, the total charge changing cross-section can be determined by sorting the etch pits in the merged image. In addition, it is possible to apply this method to the precise measurement of the emission angles of particles from nuclear reactions; this will be discussed in detail elsewhere.
4. Conclusion
We have developed a new tracing method for ion penetration
recorded as etch pits to extract nuclear fragmentation events by merging
microscopic images of the front and back surfaces of an exposed CR-39
detector. This enables us to obtain information on the displacement of
the etch pit position in a single image. We have also applied object
detection based on deep learning to the merged image to identify nu
clear fragmentation events. The accuracy of object detection was eval
uated using a learning curve expressed as a function of the number of
trainings. The accuracy of the algorithms for extracting W/B and W
objects, corresponds to Nout and Nin, essential to measure the total charge
changing cross-section, were verified statistically to be 98.0 ± 4.0% and
97.3 ± 4.0%, respectively, thereby indicating the effectiveness of the
object detection algorithm based on the deep learning to CR-39 particle
detection. We plan to apply this technique in order to measure the total
charge changing cross-section.
Funding
This research did not receive any specific grant from funding
agencies in the public, commercial, or non-profit sectors.
Declaration of competing interest
The authors declare that they have no known competing financial
interests or personal relationships that could have appeared to influence
the work reported in this paper.
Acknowledgment
We would like to thank the WERC personnel for their help and
support during the experiment.
References
Akselrod, M., Fomenko, V., Harrison, J., 2020. Latest advances in FNTD technology and instrumentation. Radiat. Meas. 133, 106302. https://doi.org/10.1016/j.radmeas.2020.106302.
Beringer, J., et al. [Particle Data Group], 2012. Review of Particle Physics (RPP). Phys. Rev. D 86, 010001.
Bisong, E., 2019. Building Machine Learning and Deep Learning Models on Google Cloud Platform, pp. 59–64.
Cecchini, S., Chiarusi, T., Giacomelli, G., Giorgini, M., Kumar, A., Mandrioli, G., Manzoor, S., Margiotta, A.R., Medinaceli, E., Patrizii, L., Popa, V., Qureshi, I.E., Sirri, G., Spurio, M., Togo, V., 2008. Fragmentation cross sections of Fe26+, Si14+ and C6+ ions of 0.3–10 A GeV on polyethylene, CR39 and aluminum targets. Nucl. Phys. A 807, 206–213.
Duan, H.R., Wu, J.Y., Ma, T.L., Li, J.S., Li, H.L., Xu, M.M., Yang, R.X., Zhang, D.H., Zhang, Z., Wang, Q., Kodaira, S., 2021. Fragmentation of carbon on elemental targets at 290 AMeV. Int. J. Mod. Phys. E 30 (06), 2150046. https://doi.org/10.1142/S0218301321500464.
Giacomelli, M., Sihver, L., Skvarč, J., Yasuda, N., Ilić, R., 2004. Projectilelike fragment emission angles in fragmentation reactions of light heavy ions in the energy region <200 MeV/nucleon: modeling and simulations. Phys. Rev. C 69, 064601. https://doi.org/10.1103/PhysRevC.69.064601.
Golovchenko, A.N., Skvarč, J., Yasuda, N., Ilić, R., Tretyakova, S.P., Ogura, K., Murakami, T., 2001. Total charge-changing and partial cross-section measurements in the reaction of 110 MeV/u 12C with paraffin. Radiat. Meas. 34, 297–300. https://doi.org/10.1016/S1350-4487(01)00171-8.
Golovchenko, A.N., Skvarč, J., Yasuda, N., Giacomelli, M., Tretyakova, S.P., Ilić, R., Bimbot, R., Toulemonde, M., Murakami, T., 2002. Total charge-changing and partial cross-section measurements in the reactions of ~110–250 MeV/nucleon 12C in carbon, paraffin, and water. Phys. Rev. C 66, 014609. https://doi.org/10.1103/PhysRevC.66.014609.
Hatori, S., Ito, Y., Ishigami, R., Yasuda, K., Inomata, T., Maruyama, T., Ikezawa, K., Takagi, K., Yamamoto, K., Fukuda, S., Kume, K., Kagiya, G., Hasegawa, T., Hatashita, M., Yamada, M., Yamada, H., Dote, M., Ohtani, N., Kakiuchi, S., Tominaga, Y., Fukumoto, S., Kondo, M., 2001. Accelerator system at the Wakasa-wan energy research center. AIP Conf. Proc. 576, 631–634. https://doi.org/10.1063/1.1395388.
Highland, V.L., 1975. Some practical remarks on multiple scattering. Nucl. Instrum. Methods 129, 497–499.
Hosoda, M., Tokonami, S., Suzuki, T., Janik, M., 2020. Machine learning as a tool for analysing the impact of environmental parameters on the radon exhalation rate from soil. Radiat. Meas. 138, 106402.
Huo, L.D., Wang, L.H., Zhu, J.H., Li, H.L., Li, J.S., Kodaira, S., Yasuda, N., Zhang, D.H., 2019. The total charge-changing cross sections and the partial cross sections of 56Fe fragmentation on Al, C and CH2 targets. Chin. J. Phys. 60, 88–97. https://doi.org/10.1016/j.cjph.2019.04.022.
Janik, M., Bossew, P., Kurihara, O., 2018. Machine learning methods as a tool to analyse incomplete or irregularly sampled radon time series data. Sci. Total Environ. 630, 1155–1167.
Kodaira, S., Morishige, K., Kawashima, H., Kitamura, H., Kurano, M., Hasebe, N., Koguchi, Y., Shinozaki, W., Ogura, K., 2016. A performance test of a new high-surface-quality and high-sensitivity CR-39 plastic nuclear track detector – TechnoTrak. Nucl. Instrum. Methods B 383, 129–135. https://doi.org/10.1016/j.nimb.2016.07.002.
LeCun, Y., Bengio, Y., Hinton, G., 2015. Deep learning. Nature 521, 436–444. https://doi.org/10.1038/nature14539.
Liu, L., Ouyang, W., Wang, X., Fieguth, P., Chen, J., Liu, X., Pietikäinen, M., 2020. Deep learning for generic object detection: a survey. Int. J. Comput. Vis. 128, 261–318.
Ota, S., Kodaira, S., Yasuda, N., Benton, E.R., Hareyama, M., Kurano, M., Sato, M., Shu, D., Hasebe, N., 2008. Tracking method for the measurement of projectile charge changing cross-section using CR-39 detector with a high speed imaging microscope. Radiat. Meas. 43, 195–198.
Ota, S., Yasuda, N., Sihver, L., Kodaira, S., Kurano, M., Naka, S., Ideguchi, Y., Benton, E.R., Hasebe, N., 2011. Charge resolution of CR-39 plastic nuclear track detectors for intermediate energy heavy ions. Nucl. Instrum. Methods B 269 (12), 1382–1388.
Perlich, C., 2011. Learning curves in machine learning. In: Sammut, C., Webb, G.I. (Eds.), Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_452.
Rashed-Nizam, Q.M., Yoshida, K., Sakamoto, T., Benton, E., Sihver, L., Yasuda, N., 2020. High-precision angular measurement of 12C ion interaction using a new imaging method with a CR-39 detector in the energy range below 100 MeV/nucleon. Radiat. Meas. 131, 106225.
Redmon, J., Farhadi, A., 2018. YOLOv3: an incremental improvement. arXiv preprint arXiv:1804.02767.
Salman, S., Liu, X., 2019. Overfitting mechanism and avoidance in deep neural networks. arXiv preprint arXiv:1901.06566.
Sasaki, M., Sanada, Y., Katengeza, E.W., Yamamoto, A., 2021. New method for visualizing the dose rate distribution around the Fukushima Daiichi Nuclear Power Plant using artificial neural networks. Sci. Rep. 11, 1857. https://doi.org/10.1038/s41598-021-81546-4.
Sihver, L., Giacomelli, M., Ota, S., Skvarč, J., Yasuda, N., Ilić, R., Kodaira, S., 2013. Projectile fragment emission angles in fragmentation reactions of light heavy ions in the energy region <200 MeV/nucleon: experimental study. Radiat. Meas. 48 (1), 73–81.
Skvarč, J., Golovchenko, A.N., 2001. A method of trajectory tracing of Z⩽10 ions in the energy region below 300 MeV/u. Radiat. Meas. 34 (1–6), 113–118. https://doi.org/10.1016/S1350-4487(01)00134-2.
Viola, P., Jones, M.J., 2004. Robust real-time face detection. Int. J. Comput. Vis. 57, 137–154.
Yasuda, N., Namiki, K., Honma, Y., Umeshima, Y., Marumo, Y., Ishii, H., Benton, E.R., 2005. Development of a high speed imaging microscope and new software for nuclear track detector analysis. Radiat. Meas. 40, 311–315. https://doi.org/10.1016/j.radmeas.2005.02.013.
Yasuda, N., Zhang, D.H., Kodaira, S., Koguchi, Y., Takebayashi, S., Shinozaki, W., Fujisaki, S., Juto, N., Kobayashi, I., Kurano, M., Shu, D., Kawashima, H., 2008. Verification of angular dependence for track sensitivity on several types of CR-39. Radiat. Meas. 43, 269–273.
Yasuda, N., Kodaira, S., Kurano, M., Kawashima, H., Tawara, H., Doke, T., Ogura, K., Hasebe, N., 2009. High speed microscope for large scale ultra heavy nuclei search using solid state track detector. J. Phys. Soc. Jpn. 78, 142–145. https://doi.org/10.1143/JPSJS.78SA.142.
Ying, X., 2019. An overview of overfitting and its solutions. J. Phys.: Conf. Ser. 1168, 022022.
Yoshida, J., Ekawa, H., Kasagi, A., Nakagawa, M., Nakazawa, K., Saito, N., Saito, T.R., Taki, M., Yoshimoto, M., 2021. CNN-based event classification of alpha-decay events in nuclear emulsion. Nucl. Instrum. Methods A 989, 164930. https://doi.org/10.1016/j.nima.2020.164930.
Zhang, D.H., Shi, R., Li, J.S., Kodaira, S., Yasuda, N., 2018. Projectile fragment emission in the fragmentation of 20Ne on C, Al and CH2 targets at 400 MeV/u. Nucl. Instrum. Methods B 435, 174–179.
Zheng, S.H., Li, W., Gou, C.W., Wu, G.F., Yao, D., Zhang, X.F., Li, J.S., Kodaira, S., Zhang, D.H., 2021. Measurement of cross sections for charge pickup by 12C on elemental targets at 400 MeV/n. Nucl. Phys. A 1016, 122317. https://doi.org/10.1016/j.nuclphysa.2021.122317.
Zou, Z., Shi, Z., Guo, Y., Ye, J., 2019. Object detection in 20 years: a survey. arXiv preprint arXiv:1905.05055.