A year of validation of the
StormTrack algorithm over
different European regions
Michele de Rosa1, Matteo Picchiani1,2, Massimiliano Sist1,2,
Fabio Del Frate2
1Geo-K s.r.l., via del Politecnico, Rome, Italy
2Tor Vergata University, via del Politecnico, Rome, Italy
Eumetsat Conference 2016, Darmstadt 26-30 Sep 2016
Questions we will try to answer
• What is StormTrack?
• What is its coverage?
• What's the validation goal?
• What ground data for validation?
• How do we validate?
• How do we measure performance?
• What is the validation process?
• What is the performance?
• Good or Bad?
• What's next?
What is StormTrack?
• An algorithm for monitoring convective objects
• MSG as the unique data source
• (Early) detection of convective objects
• Tracking of the detected objects
• Temporal and spatial extrapolation of the detected objects from 15 to 30 minutes ahead (see the sketch after this list)
• High computational efficiency and reliability (a few minutes to run over the whole MSG disk)
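As an illustration of the extrapolation step, here is a minimal sketch assuming constant centroid motion between consecutive MSG slots; the names and the linear-motion assumption are ours, not the actual StormTrack implementation.

```python
# Minimal nowcasting sketch (assumption: constant centroid motion);
# NOT the actual StormTrack code.
from dataclasses import dataclass

@dataclass
class Detection:
    lat: float      # object centroid latitude (deg)
    lon: float      # object centroid longitude (deg)
    t_min: float    # slot time in minutes

def extrapolate(prev: Detection, curr: Detection, lead_min: float):
    """Linearly extrapolate the centroid lead_min minutes past curr."""
    dt = curr.t_min - prev.t_min          # MSG slots are 15 min apart
    v_lat = (curr.lat - prev.lat) / dt    # deg per minute
    v_lon = (curr.lon - prev.lon) / dt
    return curr.lat + v_lat * lead_min, curr.lon + v_lon * lead_min

# two consecutive detections of the same object -> 15- and 30-min nowcasts
p0, p1 = Detection(45.0, 9.0, 0.0), Detection(45.1, 9.2, 15.0)
print(extrapolate(p0, p1, 15), extrapolate(p0, p1, 30))
```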
What is StormTrack?
• Different components:
• Object detector (ODT): object identification & properties
• Object tracker (OTK): temporal relationships
• Object Nowcasting Engine (ONE): temporal and spatial extrapolation
[Architecture diagram: ODT → OTK → ONE → Nowcasting, built on a Processing Layer and a Data Access Layer (Database, APIs, Archive) over a Data Extraction Layer]
What is StormTrack?
• Uses channels 5, 6 and 9
• BTD(6,9): cloud middle-layer detection (early detection)
• BTD(5,9): cloud-top detection (Schmetz, 1997; Aumann, 2009)
• Connected components
• Object definition (properties); a code sketch of this idea follows
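A hedged sketch of the detection idea above: threshold the channel brightness-temperature differences, then label connected components. The thresholds and sign conventions here are illustrative assumptions, not StormTrack's actual values.

```python
# Illustrative detection sketch; thresholds are assumptions, not StormTrack's.
import numpy as np
from scipy import ndimage

def detect_objects(bt5, bt6, bt9, btd69_thr=-2.0, btd59_thr=-2.0):
    # bt5, bt6, bt9: brightness temperatures (K) for SEVIRI channels
    # 5 (WV 6.2 um), 6 (WV 7.3 um) and 9 (IR 10.8 um)
    early_mask = (bt6 - bt9) > btd69_thr   # BTD(6,9): mid-level cloud, early detection
    top_mask = (bt5 - bt9) > btd59_thr     # BTD(5,9): cloud-top detection
    mask = early_mask | top_mask
    labels, n = ndimage.label(mask)        # connected components -> candidate objects
    # per-object properties, e.g. pixel area and centroid
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return labels, list(zip(areas, centroids))

# toy usage on random fields
rng = np.random.default_rng(0)
bt9 = 220 + 30 * rng.random((100, 100))
bt5, bt6 = bt9 - 3 + rng.random((100, 100)), bt9 - 2 + rng.random((100, 100))
labels, props = detect_objects(bt5, bt6, bt9)
print(len(props), "objects")
```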
What is its coverage?
• Over the whole MSG disk (system active at our partner Datameteo's headquarters)
• Rapid Update version at our headquarters (StormTrackRU)
• NWP data assimilation active over Europe, Northern Africa down to the Equator, and Southern Africa (StormTrackNWP)
What's the validation goal?
• EU validation area
• 10 sub-regions
• Validation period
• 22nd June 2015 – 13th July 2016
• Loss of data during Dec 2015 – Jan 2016 (reception station issues)
• Dec 2015 excluded from validation (offline reprocessing needed)
• About 250 days
• More than 20,000 analysed slots
• Not just case studies!
What ground data for validation?
• ATDNet lightning data
• Strokes from 5 minutes before to 10 minutes after the MSG slot time (see the sketch below)
• Expected location accuracy:
• United Kingdom: 1.0–3.0 km
• Western Europe: 2.0–5.0 km
• Eastern Europe: 2.0–10 km
• Eastern Atlantic: 10–15 km
• Central Africa: approximately 20 km
• South America: approximately 30–50 km
Source: Met Office
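To make the matching rule concrete, a minimal sketch of the time-window filter; the (timestamp, lat, lon) record layout is an assumption for illustration.

```python
# Sketch of the stroke/slot matching window stated above: keep ATDNet
# strokes from 5 minutes before to 10 minutes after the MSG slot time.
from datetime import datetime, timedelta

def strokes_for_slot(strokes, slot_time):
    """strokes: iterable of (timestamp, lat, lon); slot_time: datetime."""
    t_min = slot_time - timedelta(minutes=5)
    t_max = slot_time + timedelta(minutes=10)
    return [s for s in strokes if t_min <= s[0] <= t_max]

slot = datetime(2015, 7, 1, 12, 0)
strokes = [(datetime(2015, 7, 1, 11, 57), 45.2, 9.1),   # inside the window
           (datetime(2015, 7, 1, 12, 20), 45.3, 9.0)]   # outside
print(strokes_for_slot(strokes, slot))
```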
How do we validate?
• MET is a set of verification tools developed by the Developmental Testbed Center (DTC) for the numerical weather prediction community, to help it assess and evaluate the performance of numerical weather predictions.
• The primary goal of MET development is to provide a state-of-the-art verification package to the NWP community: MET incorporates newly developed and advanced verification methodologies, including new methods for diagnostic and spatial verification, as well as new techniques contributed by the verification and modelling communities.
• Several tools are part of the MET package; the MODE (object-oriented validation) tool has been chosen for the validation of the StormTrack algorithm.
How do we measure performance?
• Probability of detection: POD = n11 / (n11 + n01) = n11 / n.1
• False alarm ratio: FAR = n10 / (n11 + n10) = n10 / n1.

Contingency table:

Forecast      | Obs O = 1 (Yes)  | Obs O = 0 (No)   | Total
f = 1 (Yes)   | n11              | n10              | n1. = n11 + n10
f = 0 (No)    | n01              | n00              | n0. = n01 + n00
Total         | n.1 = n11 + n01  | n.0 = n10 + n00  | T = n11 + n10 + n01 + n00

POD and FAR are the most suitable scores for our purposes! A minimal computation sketch follows.
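A direct transcription of the two definitions above into code, useful as a sanity check; this helper is not part of MET/MODE.

```python
# POD and FAR from the contingency-table counts defined above.
def pod_far(n11, n10, n01, n00):
    pod = n11 / (n11 + n01)   # hits over observed events (n.1)
    far = n10 / (n11 + n10)   # false alarms over forecast events (n1.)
    return pod, far

# e.g. 80 hits, 20 false alarms, 40 misses, 860 correct negatives
print(pod_far(80, 20, 40, 860))   # -> approximately (0.667, 0.2)
```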
What is the validation process?
• StormTrack Archive (StormTrack, the system under test)
• Lightning Archive (the ground truth)
• Regridding (regular 0.25° grid; see the sketch below)
• MODE (the MET validation tool)
• Statistics Archive
• Stats viewer (Excel, Matlab, R, web, etc.)
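A minimal sketch of the regridding step, assuming a simple stroke-count histogram on a regular 0.25° grid; the domain bounds here are illustrative, not the actual validation domain.

```python
# Accumulate point strokes onto a regular 0.25-degree grid so they can be
# compared with StormTrack objects in MODE. Bounds are assumptions.
import numpy as np

def regrid_strokes(lats, lons, lat0=30.0, lat1=72.0,
                   lon0=-25.0, lon1=45.0, res=0.25):
    """Return a 2-D histogram of stroke counts on a regular res-degree grid."""
    lat_edges = np.arange(lat0, lat1 + res, res)
    lon_edges = np.arange(lon0, lon1 + res, res)
    counts, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
    return counts  # shape: (n_lat_cells, n_lon_cells)

grid = regrid_strokes(np.array([45.1, 45.2, 52.0]), np.array([9.0, 9.1, 4.4]))
print(grid.sum())   # 3 strokes binned
```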
What is the performance?
• Scores over:
• EU
• Italy
• Central Europe
• Interested in the scores for the other regions? Just contact me after the presentation!
What is the performance?
[Plots: POD and FAR over the validation period, per stroke threshold]
• # Strokes >= 15: POD >= FAR
• # Strokes >= 1: POD >= FAR
• # Strokes >= 2: POD >= FAR
What is the performance?
# of strokes >= 0:

Region                    | Average POD | Average FAR
ALPS                      | 0.61        | 0.72
BALTIC, UKRAINE, MOLDOVA  | 0.49        | 0.64
CENTRAL EUROPE            | 0.55        | 0.71
EASTERN EUROPE            | 0.50        | 0.64
IBERIAN PENINSULA         | 0.53        | 0.63
EUROPE                    | 0.53        | 0.68
FRANCE                    | 0.57        | 0.70
GREECE & ALBANIA          | 0.51        | 0.59
ITALY                     | 0.55        | 0.55
BALKANS                   | 0.51        | 0.65
NORTHERN EUROPE           | 0.37        | 0.92
What is the performance?
# of strokes >= 15 (EU threshold):

Region                    | Average POD | Average FAR
ALPS                      | 0.89        | 0.14
BALTIC, UKRAINE, MOLDOVA  | 0.79        | 0.16
CENTRAL EUROPE            | 0.82        | 0.19
EASTERN EUROPE            | 0.72        | 0.20
IBERIAN PENINSULA         | 0.84        | 0.10
EUROPE                    | 0.64        | 0.44
FRANCE                    | 0.83        | 0.21
GREECE & ALBANIA          | 0.77        | 0.14
ITALY                     | 0.75        | 0.17
BALKANS                   | 0.74        | 0.12
NORTHERN EUROPE           | 0.78        | 0.24
Good or Bad?
• StormTrack is a novel algorithm for storm identification, tracking and nowcasting working over the whole MSG disk (operational!)
• Data access through APIs
• Rigorous validation procedure using MET (able to measure)
• Extensive validation over about one year (and it is continuing...)
• Characterisation of the accuracy over Europe and over 10 sub-regions
• In general FAR >= POD, but most false alarms happen when strokes are scarce => suitable under moderate or heavy convection
• Good scores over Italy and the Alps
• Worst scores over Northern Europe, probably due to the lack of lightning activity and the lower satellite resolution
Good or Bad depends on several conditions, but we are now able to better understand them.
What's next?
• Continue the validation to enrich the statistics archive (add expected POD and FAR to the objects)
• Extend the validation to other regions (equatorial band of Africa)
• Use RDT as a benchmark (needs data from a reference NWCSAF system)
• StormTrackRU and StormTrackNWP validation
• Enrich the APIs and release the StormTrek app (now in beta) for Android and iOS
• Extend the nowcasting to 60 minutes (StormCast module)
• Extend the coverage using other satellites (Himawari, MSG1 IODC, GOES)
• Apply the algorithm to other kinds of data (NEXRAD)
Acknowledgements
• COMET for the support
• The Datameteo company
• Massimiliano Sist, Matteo Picchiani, Prof. Fabio Del Frate
• Cecilia Marcos Martin (AEMET) for inspiring me to use MET
• The Eumetsat WMS team
Thanks for your attention
Questions?
michele.derosa@geok.it
www.geo-k.co