Changes
On June 6, 2023 at 3:09:06 PM CDT, kelsey-friesenumanitoba-ca:
- Updated description of Detection and tracking of belugas, kayaks and motorized boats in drone video using deep learning from:
Aerial imagery surveys are commonly used in marine mammal research to determine population size, distribution and habitat use. Analysis of aerial photos involves hours of manually identifying individuals present in each image and converting raw counts into useable biological statistics. Our research proposes the use of deep learning algorithms to increase the efficiency of the marine mammal research workflow. To test the feasibility of this proposal, the existing YOLOv4 convolutional neural network model was trained to detect belugas, kayaks and motorized boats in oblique drone imagery, collected from a stationary tethered system. Automated computer-based object detection achieved the following precision and recall, respectively, for each class: beluga = 74%/72%; boat = 97%/99%; and kayak = 96%/96%. We then tested the performance of computer vision tracking of belugas and occupied watercraft in drone videos using the DeepSORT tracking algorithm, which achieved a multiple-object tracking accuracy (MOTA) ranging from 37% to 88% and multiple object tracking precision (MOTP) between 63% and 86%. Results from this research indicate that deep learning technology can detect and track features more consistently than human annotators, allowing for larger datasets to be processed within a fraction of the time while avoiding discrepancies introduced by labeling fatigue or multiple human annotators.
to:
Aerial imagery surveys are commonly used in marine mammal research to determine population size, distribution and habitat use. Analysis of aerial photos involves hours of manually identifying individuals present in each image and converting raw counts into useable biological statistics. Our research proposes the use of deep learning algorithms to increase the efficiency of the marine mammal research workflow. To test the feasibility of this proposal, the existing YOLOv4 convolutional neural network model was trained to detect belugas, kayaks and motorized boats in oblique drone imagery, collected from a stationary tethered system. Automated computer-based object detection achieved the following precision and recall, respectively, for each class: beluga = 74%/72%; boat = 97%/99%; and kayak = 96%/96%. We then tested the performance of computer vision tracking of belugas and occupied watercraft in drone videos using the DeepSORT tracking algorithm, which achieved a multiple-object tracking accuracy (MOTA) ranging from 37% to 88% and multiple object tracking precision (MOTP) between 63% and 86%. Results from this research indicate that deep learning technology can detect and track features more consistently than human annotators, allowing for larger datasets to be processed within a fraction of the time while avoiding discrepancies introduced by labeling fatigue or multiple human annotators. Résumé Les relevés par imagerie aérienne sont couramment utilisés dans la recherche sur les mammifères marins pour déterminer la taille de la population, sa répartition et l’utilisation de l’habitat. L’analyse des photos aériennes implique des heures d’identification manuelle des individus présents dans chaque image et la conversion des chiffres bruts en statistiques biologiques utilisables. Notre recherche propose l’utilisation d’algorithmes d’apprentissage en profondeur pour augmenter l’efficacité du flux de recherche sur les mammifères marins.
Pour mettre à l’essai la faisabilité de cette proposition, le modèle de réseau de neurones à convolution YOLOv4 existant a été entraîné pour détecter les bélugas, les kayaks et les embarcations motorisées dans des images de drones obliques, recueillies à partir d’un système fixe relié. La détection automatisée d’objets par ordinateur a atteint la précision et le rappel suivants, respectivement, pour chaque classe : béluga : 74 %/72 %; bateau : 97 %/99 %; kayak : 96 %/96 %. Les auteurs ont ensuite testé la performance de poursuite au moyen de la vision par ordinateur des bélugas et des motomarines dans des vidéos de drones à l’aide de l’algorithme de poursuite DeepSORT, qui a obtenu une exactitude de poursuite des objets multiples (« MOTA ») allant de 37 à 88 % et une précision de poursuite des objets multiples (« MOTP ») allant de 63 à 86 %. Les résultats de cette recherche indiquent que la technologie d’apprentissage profond peut détecter et suivre les caractéristiques plus régulièrement que les annotateurs humains, permettant de traiter des ensembles de données plus volumineux en une fraction de temps tout en évitant les écarts introduits par la fatigue d’étiquetage ou de multiples annotateurs humains. [Traduit par la Rédaction]
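The abstract quotes per-class precision/recall for detection and CLEAR-MOT scores (MOTA, MOTP) for tracking. A minimal sketch of how those quantities are computed from raw counts; all numeric inputs below are hypothetical illustrations, not the paper's data:

```python
def precision(tp: int, fp: int) -> float:
    # Fraction of predicted boxes that match a ground-truth object.
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    # Fraction of ground-truth objects that were detected.
    return tp / (tp + fn)

def mota(misses: int, false_positives: int, id_switches: int, gt_objects: int) -> float:
    # Multiple Object Tracking Accuracy: 1 minus the combined error rate,
    # with misses, false positives and identity switches summed over all frames.
    return 1.0 - (misses + false_positives + id_switches) / gt_objects

def motp(total_match_error: float, num_matches: int) -> float:
    # Multiple Object Tracking Precision: average localization error
    # (or overlap score, depending on convention) over matched pairs.
    return total_match_error / num_matches

# Hypothetical counts for illustration only (not from the paper):
print(precision(tp=74, fp=26))   # 0.74
print(mota(misses=20, false_positives=8, id_switches=2, gt_objects=100))  # 0.7
```

Note that MOTA can go negative when errors outnumber ground-truth objects, which is why per-video values in the abstract span such a wide range (37% to 88%).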
- Removed the following fields from Detection and tracking of belugas, kayaks and motorized boats in drone video using deep learning:
  - awardTitle
  - awardURI
  - funderIdentifier
  - funderIdentifierType
  - funderName
  - funderSchemeURI
  - grantNumber
  - keywords
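In CKAN, fields like these are stored on the dataset as a list of key/value "extras", which is how they appear in the record diff below. A minimal sketch of that shape, using values taken from this record; `non_empty_extras` is a hypothetical helper, not part of CKAN's API:

```python
# CKAN-style key/value extras, as they appeared on this dataset
# (several carried empty values).
extras = [
    {"key": "awardTitle", "value": "The Canada Excellence Research Chair (CERC) and the Canada Research Chair (CRC programs)"},
    {"key": "awardURI", "value": "https://www.cerc.gc.ca/"},
    {"key": "funderName", "value": ""},
    {"key": "grantNumber", "value": ""},
]

def non_empty_extras(extras: list[dict]) -> list[dict]:
    # Keep only entries whose value is a non-empty string.
    return [e for e in extras if e["value"]]

print([e["key"] for e in non_empty_extras(extras)])  # ['awardTitle', 'awardURI']
```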
 {
   "Author": [
     {
       "affiliation": "Centre for Earth Observation Science - University of Manitoba",
       "creatorName": "Harasyn, Madison",
       "email": "Madison.harasyn@umanitoba.ca",
       "nameIdentifier": "https://orcid.org/0000-0002-5741-6766",
       "nameIdentifierScheme": "ORCID",
       "nameType": "Personal",
       "schemeURI": "http://orcid.org/"
     },
     {
       "affiliation": "Centre for Earth Observation Science - University of Manitoba",
       "creatorName": "Chan, Wayne",
       "email": "wayne.chan@umanitoba.ca",
       "nameIdentifier": "",
       "nameType": "Personal"
     },
     {
       "affiliation": "Centre for Earth Observation Science - University of Manitoba",
       "creatorName": "Ausen, Emma",
       "email": "emma.ausen@umanitoba.ca",
       "nameIdentifier": "",
       "nameType": "Personal"
     },
     {
       "affiliation": "Centre for Earth Observation Science - University of Manitoba",
       "creatorName": "Barber, David",
       "email": "david.barber@umanitoba.ca",
       "nameIdentifier": "0000-0001-9466-3291",
       "nameIdentifierScheme": "ORCID",
       "nameType": "Personal",
       "schemeURI": "http://orcid.org/"
     }
   ],
   "Identifier": "10.1139/juvs-2021-0024",
   "PublicationYear": "2022",
   "Publisher": "Drone Systems and Applications",
   "ResourceType": "journal article",
   "Rights": "Creative Commons Attribution 4.0 International",
   "Version": "1.0",
   "author": null,
   "author_email": null,
   "citation": "Madison L.Harasyn, Wayne S.Chan, Emma L.Ausen, and David G.Barber. Detection and tracking of belugas, kayaks and motorized boats in drone video using deep learning. Drone Systems and Applications. 10(1): 77-96. https://doi.org/10.1139/juvs-2021-0024",
   "creator_user_id": "cde7b848-a882-4fc7-97c9-670417bd6b43",
   "descriptionType": "Abstract",
-  "extras": [
-    { "key": "awardTitle", "value": "The Canada Excellence Research Chair (CERC) and the Canada Research Chair (CRC programs)" },
-    { "key": "awardURI", "value": "https://www.cerc.gc.ca/" },
-    { "key": "funderIdentifier", "value": "" },
-    { "key": "funderIdentifierType", "value": "" },
-    { "key": "funderName", "value": "" },
-    { "key": "funderSchemeURI", "value": "" },
-    { "key": "grantNumber", "value": "" },
-    { "key": "keywords", "value": "Beluga,Unmanned Aerial Vehicle,computer vision,deep learning,object detection,object tracking" }
-  ],
   "groups": [
     {
-      "description": "Features and characteristics of salt water bodies.\r\n\r\nIn CEOS, related research themes include biogeochemistry, modelling, marine mammals, oil spill response, physical oceanography, remote sensing and technology and trace metals and contaminants",
-      "display_name": "Marine",
-      "id": "98238b1c-5be8-41ad-8c6e-74cdc4f5f369",
-      "image_display_url": "ata/uploads/group/2021-10-31-211516.365746ofinspireoceanographic.svg",
-      "name": "marine",
-      "title": "Marine"
-    },
-    {
-      "description": "Image: \"Earth from Space\" by NASA Goddard Photo and Video is licensed under CC BY 2.0",
-      "display_name": "Remote Sensing",
-      "id": "3ec49cbb-4da6-4fe8-8d54-5b6ce03b49d9",
-      "image_display_url": "anitoba.ca/data/uploads/group/2022-02-05-222621.346712earthimage.jpg",
-      "name": "remote-sensing",
-      "title": "Remote Sensing"
+      "description": "Inland water features, drainage systems and their characteristics. Examples of data you can find here include river and lake data, water quality data. \r\n\r\nIn CEOS, related research themes include biogeochemistry, Inland lakes and waters, modelling, remote sensing and technology, trace metals and contaminants.",
+      "display_name": "Freshwater",
+      "id": "8f8cd877-b037-4b1a-b928-f86d9e093741",
+      "image_display_url": "/data/uploads/group/2021-10-31-211937.658599hyinspirehydrography.svg",
+      "name": "freshwater",
+      "title": "Freshwater"
     }
   ],
   "id": "54b0d7a1-8536-4d40-b1bb-daad81805f43",
   "isopen": false,
   "language": "English",
   "licenceType": "Open",
   "license_id": null,
   "license_title": null,
   "maintainer": null,
   "maintainer_email": null,
   "metadata_created": "2022-04-07T19:45:13.021227",
-  "metadata_modified": "2023-05-18T21:05:56.950941",
+  "metadata_modified": "2023-06-06T20:09:06.818440",
   "name": "detect-video-deep-learning",
-  "notes": "Aerial imagery surveys are commonly used in marine mammal research to determine population size, distribution and habitat use. Analysis of aerial photos involves hours of manually identifying individuals present in each image and converting raw counts into useable biological statistics. Our research proposes the use of deep learning algorithms to increase the efficiency of the marine mammal research workflow. To test the feasibility of this proposal, the existing YOLOv4 convolutional neural network model was trained to detect belugas, kayaks and motorized boats in oblique drone imagery, collected from a stationary tethered system. Automated computer-based object detection achieved the following precision and recall, respectively, for each class: beluga = 74%/72%; boat = 97%/99%; and kayak = 96%/96%. We then tested the performance of computer vision tracking of belugas and occupied watercraft in drone videos using the DeepSORT tracking algorithm, which achieved a multiple-object tracking accuracy (MOTA) ranging from 37% to 88% and multiple object tracking precision (MOTP) between 63% and 86%. Results from this research indicate that deep learning technology can detect and track features more consistently than human annotators, allowing for larger datasets to be processed within a fraction of the time while avoiding discrepancies introduced by labeling fatigue or multiple human annotators.",
+  "notes": "Aerial imagery surveys are commonly used in marine mammal research to determine population size, distribution and habitat use. Analysis of aerial photos involves hours of manually identifying individuals present in each image and converting raw counts into useable biological statistics. Our research proposes the use of deep learning algorithms to increase the efficiency of the marine mammal research workflow. To test the feasibility of this proposal, the existing YOLOv4 convolutional neural network model was trained to detect belugas, kayaks and motorized boats in oblique drone imagery, collected from a stationary tethered system. Automated computer-based object detection achieved the following precision and recall, respectively, for each class: beluga = 74%/72%; boat = 97%/99%; and kayak = 96%/96%. We then tested the performance of computer vision tracking of belugas and occupied watercraft in drone videos using the DeepSORT tracking algorithm, which achieved a multiple-object tracking accuracy (MOTA) ranging from 37% to 88% and multiple object tracking precision (MOTP) between 63% and 86%. Results from this research indicate that deep learning technology can detect and track features more consistently than human annotators, allowing for larger datasets to be processed within a fraction of the time while avoiding discrepancies introduced by labeling fatigue or multiple human annotators.\r\n\r\nRésumé Les relevés par imagerie aérienne sont couramment utilisés dans la recherche sur les mammifères marins pour déterminer la taille de la population, sa répartition et l’utilisation de l’habitat. L’analyse des photos aériennes implique des heures d’identification manuelle des individus présents dans chaque image et la conversion des chiffres bruts en statistiques biologiques utilisables. Notre recherche propose l’utilisation d’algorithmes d’apprentissage en profondeur pour augmenter l’efficacité du flux de recherche sur les mammifères marins. Pour mettre à l’essai la faisabilité de cette proposition, le modèle de réseau de neurones à convolution YOLOv4 existant a été entraîné pour détecter les bélugas, les kayaks et les embarcations motorisées dans des images de drones obliques, recueillies à partir d’un système fixe relié. La détection automatisée d’objets par ordinateur a atteint la précision et le rappel suivants, respectivement, pour chaque classe : béluga : 74 %/72 %; bateau : 97 %/99 %; kayak : 96 %/96 %. Les auteurs ont ensuite testé la performance de poursuite au moyen de la vision par ordinateur des bélugas et des motomarines dans des vidéos de drones à l’aide de l’algorithme de poursuite DeepSORT, qui a obtenu une exactitude de poursuite des objets multiples (« MOTA ») allant de 37 à 88 % et une précision de poursuite des objets multiples (« MOTP ») allant de 63 à 86 %. Les résultats de cette recherche indiquent que la technologie d’apprentissage profond peut détecter et suivre les caractéristiques plus régulièrement que les annotateurs humains, permettant de traiter des ensembles de données plus volumineux en une fraction de temps tout en évitant les écarts introduits par la fatigue d’étiquetage ou de multiples annotateurs humains. [Traduit par la Rédaction]",
   "num_resources": 2,
   "num_tags": 6,
   "organization": {
     "approval_status": "approved",
     "created": "2017-07-21T13:15:49.935872",
     "description": "The Centre for Earth Observation Science (CEOS) was established in 1994 with a mandate to research, preserve and communicate knowledge of Earth system processes using the technologies of Earth Observation Science. Research is multidisciplinary and collaborative seeking to understand the complex interrelationships between elements of Earth systems, and how these systems will likely respond to climate change. Although researchers have worked in many regions, the Arctic marine system has always been a unifying focus of activity.\r\n\r\nIn 2012, CEOS, along with the Greenland Climate Research Centre (GCRC, Nuuk, Greenland) and the Arctic Research Centre (ARC, Aarhus, Denmark) established the Arctic Science Partnership, thereby integrating academic and research initiatives.\r\n\r\nAreas of existing research activity are divided among key themes:\r\n\r\nArctic Anthropology/Paleoclimatology: LiDAR scanning and digital site preservation, archaeo-geophysics, permafrost degredation, lithic morphometrics, zooarchaeology, proxy studies, paleodistribution of sea ice, landscape learning, Paleo-Eskimo culture, Thule Inuit culture, ethnographic analogy, traditional knowledge, climate change and northern heritage resource management.\r\n\r\nAtmospheric Studies/Meteorology: Boundary layer, precipitation, clouds, storms and extreme weather, circulation, eddy correlations, polar vortex, climate, teleconnections, geophysical fluid dynamics, flux and energy budgets, ocean-sea ice-atmosphere interface, radiative transfer, ice albedo feedback, cloud radiative forcing, pCO2. \r\n\r\nBiogeochemistry: Organic carbon, greenhouse gases, bubbles, Ikaite, carbonate chemistry, CO2 fluxes, mercury and other trace metals, minerals, hydrocarbons, brine processes, otolith microchemistry, sediments, biomarkers. \r\n\r\nContaminants: Mercury, trace metals, PAHs, source, transport, transformation, pathways, bioaccumulations, marine ecosystems, marine chemistry. \r\nEarth Observation Science: Active and passive microwave, LiDAR, EM induction, spatial-temporal analysis, forward and inverse scattering models, complex permittivity, ocean colour, ocean surface roughness, NIR, TIR, satellite telemetry, GPS. Ice-Associated Biology: Biophysical processes, primary production; ice algae, ice microbiology, bio-optics, under-ice phytoplankton. \r\n\r\nInland Lakes and Waters: Hydrologic connectivity, watershed systems, sediment transport, nutrient transport, contaminants, landscape processes, remote sensing, freshwater-marine coupling. Marine Mammals: Seals, whales, habitat, conservation, satellite telemetry, distribution, population studies, prey behaviour, bioacoustics.\r\n\r\nModelling: Simulation of sea ice and oceanic regional processes, Nucleus for European Modelling of the Ocean (NEMO), ice-ocean modelling and interactions, hind cast simulations and projections for sea ice state and ocean variables based on CMIP5 scenarios and MIROC5 forcing, validation.\r\n\r\nOceanography: Circulation, temperature, in-flow and out-flow shelves, water dynamics, microturbulence, Beaufort Gyre, eddy correlations.\r\n\r\nSea Ice Geophysics:Thermodynamic and dynamic processes, extreme ice features and hazards, snow, ridges, polynyas.\r\n\r\nTraditional and Local Knowledge: Indigenous cultures, Inuit, Inuvialuit, oral history, toponomy, mobility and settlement, hunting, food security, sea ice use, community-based research, community-based monitoring, two ways of knowing.",
     "id": "9e21f6b6-d13f-4ba2-a379-fd962f507071",
     "image_url": "2021-11-13-003953.952874UMLogoHORZ.jpg",
     "is_organization": true,
     "name": "ceos",
     "state": "active",
     "title": "Centre for Earth Observation Science",
     "type": "organization"
   },
   "owner_org": "9e21f6b6-d13f-4ba2-a379-fd962f507071",
   "private": false,
   "related_datasets": [
     "b5f259b4-3ace-4750-bfb0-47c4e794082f"
   ],
   "related_programs": [],
   "relationships_as_object": [],
   "relationships_as_subject": [],
   "resources": [
     {
       "cache_last_updated": null,
       "cache_url": null,
       "created": "2022-04-07T19:49:13.974750",
       "datastore_active": false,
       "datastore_contains_all_records_of_source_file": false,
       "description": "Churchill Beluga Boat Drone Imagery related journal article published in Drone Systems and Applications.\r\nDOI: https://doi.org/10.1139/juvs-2021-0024",
       "format": "PDF",
       "hash": "",
       "id": "5bcbb0bc-425b-4fad-b7ff-4c8599043dcf",
       "last_modified": "2022-04-07T20:02:20.594051",
       "metadata_modified": "2023-05-18T21:05:56.975275",
       "mimetype": "application/pdf",
       "mimetype_inner": null,
       "name": "Detection and tracking of belugas, kayaks and motorized boats in drone video using deep learning",
       "package_id": "54b0d7a1-8536-4d40-b1bb-daad81805f43",
       "position": 0,
       "resCategory": "supplemental",
       "resource_type": null,
       "size": 4104522,
       "state": "active",
       "url": "rce/5bcbb0bc-425b-4fad-b7ff-4c8599043dcf/download/juvs-2021-0024.pdf",
       "url_type": "upload"
     },
     {
       "cache_last_updated": null,
       "cache_url": null,
       "created": "2023-05-18T21:05:57.006643",
       "datastore_active": false,
       "datastore_contains_all_records_of_source_file": false,
       "description": "Researchers at CEOS are often asked to write a field story about their work, to make their research more accessible. We decided to do something a little different for our work on applying machine learning to detecting and tracking beluga whales: we are presenting it as a comic-book style video!",
       "format": "",
       "hash": "",
       "id": "1cd6dbae-5c9d-440d-b29b-26c84fbc5a7c",
       "last_modified": null,
       "metadata_modified": "2023-05-18T21:05:56.975546",
       "mimetype": null,
       "mimetype_inner": null,
       "name": "One Beluga, Two Beluga, Three Beluga, Four: How to Count Belugas When You Run Out of Fingers and Toes",
       "package_id": "54b0d7a1-8536-4d40-b1bb-daad81805f43",
       "position": 1,
       "resCategory": "supplemental",
       "resource_type": null,
       "size": null,
       "state": "active",
       "url": "n/beluga-graphic-novel/resource/58aed159-4a62-4c2b-9978-967ad5f356a6",
       "url_type": null
     }
   ],
   "rightsIdentifier": "CC-BY-4.0",
   "rightsIdentifierScheme": "SPDX",
   "rightsSchemeURI": "https://spdx.org/licenses",
   "rightsURI": "https://spdx.org/licenses/CC-BY-4.0.html",
   "schemeURI": "",
   "state": "active",
   "subjectScheme": "",
   "tags": [
     {
       "display_name": "Beluga",
292 | "id": "a9f25a89-b0ef-4d4d-993d-73f28e0d702a", | 279 | "id": "a9f25a89-b0ef-4d4d-993d-73f28e0d702a", | ||
293 | "name": "Beluga", | 280 | "name": "Beluga", | ||
294 | "state": "active", | 281 | "state": "active", | ||
295 | "vocabulary_id": null | 282 | "vocabulary_id": null | ||
296 | }, | 283 | }, | ||
297 | { | 284 | { | ||
298 | "display_name": "Unmanned Aerial Vehicle", | 285 | "display_name": "Unmanned Aerial Vehicle", | ||
299 | "id": "a6dc9001-e6da-4a84-bfec-2941d3ebce78", | 286 | "id": "a6dc9001-e6da-4a84-bfec-2941d3ebce78", | ||
300 | "name": "Unmanned Aerial Vehicle", | 287 | "name": "Unmanned Aerial Vehicle", | ||
301 | "state": "active", | 288 | "state": "active", | ||
302 | "vocabulary_id": null | 289 | "vocabulary_id": null | ||
303 | }, | 290 | }, | ||
304 | { | 291 | { | ||
305 | "display_name": "computer vision", | 292 | "display_name": "computer vision", | ||
306 | "id": "d7270905-c420-4d19-aa9c-c6f818ab5b67", | 293 | "id": "d7270905-c420-4d19-aa9c-c6f818ab5b67", | ||
307 | "name": "computer vision", | 294 | "name": "computer vision", | ||
308 | "state": "active", | 295 | "state": "active", | ||
309 | "vocabulary_id": null | 296 | "vocabulary_id": null | ||
310 | }, | 297 | }, | ||
311 | { | 298 | { | ||
312 | "display_name": "deep learning", | 299 | "display_name": "deep learning", | ||
313 | "id": "87526358-2d8a-4c78-8375-38c132b53d5a", | 300 | "id": "87526358-2d8a-4c78-8375-38c132b53d5a", | ||
314 | "name": "deep learning", | 301 | "name": "deep learning", | ||
315 | "state": "active", | 302 | "state": "active", | ||
316 | "vocabulary_id": null | 303 | "vocabulary_id": null | ||
317 | }, | 304 | }, | ||
318 | { | 305 | { | ||
319 | "display_name": "object detection", | 306 | "display_name": "object detection", | ||
320 | "id": "a3d44586-cba5-4685-b07a-2d2f16578353", | 307 | "id": "a3d44586-cba5-4685-b07a-2d2f16578353", | ||
321 | "name": "object detection", | 308 | "name": "object detection", | ||
322 | "state": "active", | 309 | "state": "active", | ||
323 | "vocabulary_id": null | 310 | "vocabulary_id": null | ||
324 | }, | 311 | }, | ||
325 | { | 312 | { | ||
326 | "display_name": "object tracking", | 313 | "display_name": "object tracking", | ||
327 | "id": "28ce0864-2ed8-43d1-b80d-d79684cac63f", | 314 | "id": "28ce0864-2ed8-43d1-b80d-d79684cac63f", | ||
328 | "name": "object tracking", | 315 | "name": "object tracking", | ||
329 | "state": "active", | 316 | "state": "active", | ||
330 | "vocabulary_id": null | 317 | "vocabulary_id": null | ||
331 | } | 318 | } | ||
332 | ], | 319 | ], | ||
333 | "theme": [ | 320 | "theme": [ | ||
334 | "8f8cd877-b037-4b1a-b928-f86d9e093741", | 321 | "8f8cd877-b037-4b1a-b928-f86d9e093741", | ||
335 | "98238b1c-5be8-41ad-8c6e-74cdc4f5f369", | 322 | "98238b1c-5be8-41ad-8c6e-74cdc4f5f369", | ||
336 | "3ec49cbb-4da6-4fe8-8d54-5b6ce03b49d9" | 323 | "3ec49cbb-4da6-4fe8-8d54-5b6ce03b49d9" | ||
337 | ], | 324 | ], | ||
338 | "title": "Detection and tracking of belugas, kayaks and motorized | 325 | "title": "Detection and tracking of belugas, kayaks and motorized | ||
339 | boats in drone video using deep learning", | 326 | boats in drone video using deep learning", | ||
340 | "type": "publication", | 327 | "type": "publication", | ||
341 | "url": null, | 328 | "url": null, | ||
342 | "version": null | 329 | "version": null | ||
343 | } | 330 | } |