Slideshows by User: DarioBuonoPhDinEcono / Sun, 08 Sep 2024 10:01:48 GMT SlideShare feed for Slideshows by User: DarioBuonoPhDinEcono Introduction to LLMs and their relevance for Official Statistics /slideshow/introduction-to-llms-and-their-relevance-for-official-statistics/271646110
An introduction to large language models and their relevance for statistical offices, 2024 edition. It can be downloaded at https://op.europa.eu/en/publication-detail/-/publication/f4a703b3-ea60-11ee-bf53-01aa75ed71a1/language-en This manual is a practical resource for data professionals of Statistical Offices, introducing the use of Large Language Models (LLMs) in the field of Official Statistics. It outlines how LLMs can tackle complex data problems with their advanced language-processing capabilities and how these models can be integrated into current processes. The guide introduces LLMs, delineating their evolution, architecture, applications and implications for future work within the AI realm. Additionally, it emphasizes the need for ethical and responsible applications, blending research insights with practical industry examples to ensure professionals can maximize LLM benefits while maintaining trust and reliability in their work.

Sun, 08 Sep 2024 10:01:48 GMT /slideshow/introduction-to-llms-and-their-relevance-for-official-statistics/271646110 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Introduction to LLMs and their relevance for Official Statistics
Big Data and Nowcasting /slideshow/big-data-and-nowcasting/101317432
SAMPLEAU Workshop panel discussion about new and traditional data sources for local indicators: is it possible to reconcile small areas and big data?

Fri, 08 Jun 2018 13:35:23 GMT /slideshow/big-data-and-nowcasting/101317432 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Big Data and Nowcasting
Reporting uncertainties - too much information? /DarioBuonoPhDinEcono/reporting-uncertainties-too-much-information
Official statistics are published by government agencies and other international institutions to provide information on the economy, living conditions, social development, etc. These metrics are evaluated using different sources, primarily surveys and censuses, supplemented by data obtained from government administrations or the private sector. Several qualitative criteria are considered the basis for trustworthy official statistics, such as impartiality, transparency, relevance and independence. However, as the published metrics are derived from statistical analysis of imperfect and potentially incomplete data, errors and uncertainties are inevitable and, in some cases, require revisions or corrections that can reduce confidence in the overall process. It is also important to recognize that the uncertainties originate both from statistical errors, such as the use of limited raw data, and from bias induced by incomplete information or modeling assumptions. Understanding how different sources of error lead to bias and variance helps us improve the overall process, resulting in more accurate predictions. Quantitative measures of variance errors are commonplace and easy to convey; among these, the standard error of the mean is perhaps the most popular and is reported with the ± symbol. Measures of the spread in the actual data are also easy to estimate and are disseminated using the variance or, more frequently, the standard deviation. A more complete representation of the statistical spread in the raw data leads to percentiles and, eventually, to reporting complete probability distributions. Measures of bias, on the other hand, are not well developed because in many cases they are not directly computable. In the engineering community, rather than presenting the variance and bias errors, the focus is on identifying and ranking the sources of uncertainty that explain the imprecision in the estimates.
In this work we discuss applications of two global sensitivity metrics, the Sobol indices and the active subspace variables, as tools to describe the variance errors. Furthermore, we discuss the distance metric as a strategy to assess bias errors, derived from classical measures of discrepancy between probability distribution functions.
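The variance-decomposition idea behind Sobol indices can be illustrated with a toy computation. The sketch below is not from the presentation: it uses a hypothetical linear model with three independent uniform inputs, for which the first-order indices are known analytically (S_i = a_i² / Σ a_j²), and estimates them with the standard "pick-freeze" Monte Carlo scheme:

```python
import numpy as np

# Hypothetical toy model (not from the paper): a linear combination of
# three independent uniform(0, 1) inputs.  The analytic first-order
# indices are S_i = a_i^2 / sum_j(a_j^2), a convenient check for the
# Monte Carlo estimator below.
COEF = np.array([4.0, 2.0, 1.0])

def model(x):
    return x @ COEF

def first_order_sobol(f, dim, n, rng):
    """Estimate first-order Sobol indices with the 'pick-freeze' scheme."""
    a = rng.random((n, dim))
    b = rng.random((n, dim))
    fa, fb = f(a), f(b)
    var_y = np.var(np.concatenate([fa, fb]))
    s = np.empty(dim)
    for i in range(dim):
        ab_i = a.copy()
        ab_i[:, i] = b[:, i]                 # freeze input i at B's values
        s[i] = np.mean(fb * (f(ab_i) - fa)) / var_y
    return s

rng = np.random.default_rng(0)
s = first_order_sobol(model, dim=3, n=200_000, rng=rng)
print(np.round(s, 3))  # analytic values: 16/21, 4/21, 1/21
```

For real sensitivity studies, dedicated packages such as SALib provide these and total-order estimators with proper sampling designs.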

Fri, 14 Jul 2017 10:03:46 GMT /DarioBuonoPhDinEcono/reporting-uncertainties-too-much-information DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Reporting uncertainties - too much information?
Skills for the new generation of statisticians /slideshow/skills-for-the-new-generation-of-statisticians/77783888
This presentation analyses the competence profile of official statisticians, with a particular focus on new data science competences. Modernization of official statistics will depend on the capability to incorporate new data sources and benefit from disruptive technologies. This will require new capabilities, skills and competences that may not be part of the traditional skill set of official statisticians. The document was presented to the Conference of European Statisticians organised at the United Nations in Geneva.

Wed, 12 Jul 2017 08:38:48 GMT /slideshow/skills-for-the-new-generation-of-statisticians/77783888 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Skills for the new generation of statisticians
JDemetra+ Java Tool for Seasonal Adjustment /slideshow/jdemetra-java-tool-for-seasonal-adjustment/77743626
JDemetra+ is a tool for seasonal adjustment (SA) developed by the National Bank of Belgium (NBB) in cooperation with the Deutsche Bundesbank and Eurostat, in accordance with the Guidelines of the European Statistical System (ESS). User support, training and methodological development are provided by the dedicated Centre of Excellence on Seasonal Adjustment coordinated by INSEE, the French National Statistical Office. Since February 2015, JDemetra+ has been officially recommended to the members of the ESS and the European System of Central Banks as software for seasonal and calendar adjustment of official statistics.

Tue, 11 Jul 2017 10:04:56 GMT /slideshow/jdemetra-java-tool-for-seasonal-adjustment/77743626 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) JDemetra+ Java Tool for Seasonal Adjustment
Big data and macroeconomic nowcasting from data access to modelling /slideshow/big-data-and-macroeconomic-nowcasting-from-data-access-to-modelling/70060186
Parallel advances in IT and in the social use of Internet-related applications provide the general public with access to a vast amount of information. The associated Big Data are potentially very useful for a variety of applications, ranging from marketing to tackling fiscal evasion. From the point of view of official statistics, the main questions are whether, and to what extent, Big Data are a field worth investing in to expand, check and improve the data production process, and which types of partnerships will have to be formed for this purpose. Nowcasting of macroeconomic indicators is a well-identified field where Big Data have the potential to play a decisive role in the future. In this paper we present the results and main recommendations of the Eurostat-funded project "Big Data and macroeconomic nowcasting", implemented by GOPA Consultants, which benefited from the cooperation and work of the Eurostat task force on Big Data and a few external academic experts.
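As a concrete, if much simplified, illustration of what nowcasting with timelier data involves, the sketch below fits a toy "bridge equation" on synthetic series (all numbers are invented, nothing here comes from the Eurostat project): quarterly growth is regressed on the quarterly average of a monthly indicator, and the fitted equation is applied to the latest quarter, whose monthly data are already observed.

```python
import numpy as np

# Illustrative bridge-equation nowcast on synthetic data.
rng = np.random.default_rng(1)
monthly = rng.normal(size=40 * 3)                      # 40 quarters of a monthly indicator
q_avg = monthly.reshape(40, 3).mean(axis=1)            # aggregate months to quarters
growth = 0.5 * q_avg + rng.normal(scale=0.1, size=40)  # "true" quarterly growth

# Fit the bridge equation on the first 39 complete quarters ...
X = np.column_stack([np.ones(39), q_avg[:39]])
beta, *_ = np.linalg.lstsq(X, growth[:39], rcond=None)

# ... and nowcast quarter 40 from its already-observed monthly average.
nowcast = beta[0] + beta[1] * q_avg[39]
print(round(float(nowcast), 3), round(float(growth[39]), 3))
```

Real nowcasting systems replace this single regression with factor models, MIDAS regressions or similar, but the data-flow (timely monthly source, time aggregation, fitted link to the target) is the same.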

Mon, 12 Dec 2016 15:14:45 GMT /slideshow/big-data-and-macroeconomic-nowcasting-from-data-access-to-modelling/70060186 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Big data and macroeconomic nowcasting from data access to modelling
Big Data Analysis: The curse of dimensionality in official statistics /slideshow/big-data-analysys-the-curse-of-dimensionality-in-official-statistics/67502761
Statistical authorities need to produce accurate data faster and in a cost-effective way, to become more responsive to users' demands while continuing to provide high-quality output. One way to achieve this is to make use of all newly accessible data sources, for example administrative data and big data. As a result, statistical offices will have to deal more and more with a huge number of time series, in particular for producing model-based statistics. Using high-dimensional datasets will most likely push statistical authorities towards a different approach, in particular an awareness that socio-economic variables increasingly follow non-linear processes that cannot be captured by probability distributions described by a few parameters. This will require adapting the way the world is observed through data, taking uncertainty and complexity into account to a greater extent, which will in turn affect the dissemination and communication activities of statistical authorities.
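One way to see why high-dimensional datasets demand a different approach is the concentration of distances. The small sketch below (illustrative only, not from the presentation) shows that as the dimension grows, the nearest and farthest neighbours of a random point become almost equidistant, which undermines many intuitions built in low dimensions:

```python
import numpy as np

# The curse of dimensionality in one loop: sample 500 random points in
# the unit hypercube and compare the nearest and farthest distances from
# the first point.  The min/max ratio approaches 1 as dimension grows.
rng = np.random.default_rng(0)
ratios = {}
for dim in (2, 10, 100, 1000):
    x = rng.random((500, dim))                # 500 points in [0, 1]^dim
    d = np.linalg.norm(x[1:] - x[0], axis=1)  # distances from the first point
    ratios[dim] = d.min() / d.max()
    print(dim, round(ratios[dim], 3))
```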

Statistical authorities need to produce accurate data faster and in a cost effective way, to become more responsive to users卒 demands, while at the same time continuing to provide high quality output. One way to fulfil this is to make use of all new accessible data sources, as for example administrative data and big data. As a result, statistical offices will have to deal more and more with a "huge" number" of time series, in particular for producing model based statistics. Using high dimensional datasets will most likely urge statistical authorities to follow a different approach, in particular to be conscious that the measurement of socio-economic variables will follow more and more non-linear processes that could not be described by probability distributions that could be easily described by few parameters. It will thus imply to adapt the way to observe the world through data taking into account at a greater extent uncertainty and complexity, which will in turn impact dissemination and communication activities of statistical authorities. ]]>
Fri, 21 Oct 2016 14:03:57 GMT /slideshow/big-data-analysys-the-curse-of-dimensionality-in-official-statistics/67502761 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Big Data Analysis: The curse of dimensionality in official statistics DarioBuonoPhDinEcono Statistical authorities need to produce accurate data faster and in a cost effective way, to become more responsive to users卒 demands, while at the same time continuing to provide high quality output. One way to fulfil this is to make use of all new accessible data sources, as for example administrative data and big data. As a result, statistical offices will have to deal more and more with a "huge" number" of time series, in particular for producing model based statistics. Using high dimensional datasets will most likely urge statistical authorities to follow a different approach, in particular to be conscious that the measurement of socio-economic variables will follow more and more non-linear processes that could not be described by probability distributions that could be easily described by few parameters. It will thus imply to adapt the way to observe the world through data taking into account at a greater extent uncertainty and complexity, which will in turn impact dissemination and communication activities of statistical authorities. <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/sessiond7baldaccibuonograsthecurseofdimensionalityinofficialstatisticsfinal-161021140357-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Statistical authorities need to produce accurate data faster and in a cost effective way, to become more responsive to users卒 demands, while at the same time continuing to provide high quality output. One way to fulfil this is to make use of all new accessible data sources, as for example administrative data and big data. 
As a result, statistical offices will have to deal more and more with a &quot;huge&quot; number of time series, in particular for producing model-based statistics. Using high-dimensional datasets will most likely urge statistical authorities to follow a different approach, in particular to be conscious that the measurement of socio-economic variables will increasingly follow non-linear processes that cannot be described by probability distributions characterized by only a few parameters. This will require adapting the way the world is observed through data, taking greater account of uncertainty and complexity, which will in turn impact the dissemination and communication activities of statistical authorities.
Big Data Analysis: The curse of dimensionality in official statistics from Dario Buono
]]>
1068 5 https://cdn.slidesharecdn.com/ss_thumbnails/sessiond7baldaccibuonograsthecurseofdimensionalityinofficialstatisticsfinal-161021140357-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Physics4Stats & BMI vs. QoL /slideshow/from-phisics-to-statistics-and-body-mass-index-and-quality-of-life/66847502 methnetbuono-161007081729
1st idea: replace growth rates with acceleration and use F = Ma. 2nd idea: link Body Mass Index and Quality of Life data.]]>

1st idea: replace growth rates with acceleration and use F = Ma. 2nd idea: link Body Mass Index and Quality of Life data.]]>
Fri, 07 Oct 2016 08:17:29 GMT /slideshow/from-phisics-to-statistics-and-body-mass-index-and-quality-of-life/66847502 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Physics4Stats & BMI vs. QoL DarioBuonoPhDinEcono 1st idea: replace growth rates with acceleration and use F = Ma. 2nd idea: link Body Mass Index and Quality of Life data. <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/methnetbuono-161007081729-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> 1st idea: replace growth rates with acceleration and use F = Ma. 2nd idea: link Body Mass Index and Quality of Life data.
Physics4Stats & BMI vs. QoL from Dario Buono
]]>
209 3 https://cdn.slidesharecdn.com/ss_thumbnails/methnetbuono-161007081729-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Methodological network and strategy /slideshow/methodological-network-and-strategy/66820734 methodologicalnetworkandstrategyunitb1-161006180105
Medium-term priorities]]>

Medium-term priorities]]>
Thu, 06 Oct 2016 18:01:05 GMT /slideshow/methodological-network-and-strategy/66820734 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Methodological network and strategy DarioBuonoPhDinEcono Medium-term priorities <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/methodologicalnetworkandstrategyunitb1-161006180105-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Medium-term priorities
Methodological network and strategy from Dario Buono
]]>
250 4 https://cdn.slidesharecdn.com/ss_thumbnails/methodologicalnetworkandstrategyunitb1-161006180105-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Safebook quality grading /slideshow/safebook-quality-grading/66820571 20141006directorpresentationsafebook-161006175650
Ranking/labelling/clustering MIP scoreboard ]]>

Ranking/labelling/clustering MIP scoreboard ]]>
Thu, 06 Oct 2016 17:56:50 GMT /slideshow/safebook-quality-grading/66820571 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Safebook quality grading DarioBuonoPhDinEcono Ranking/labelling/clustering MIP scoreboard <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/20141006directorpresentationsafebook-161006175650-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Ranking/labelling/clustering MIP scoreboard
Safebook quality grading from Dario Buono
]]>
213 4 https://cdn.slidesharecdn.com/ss_thumbnails/20141006directorpresentationsafebook-161006175650-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
MIP: Analysis of metadata and data revisions /DarioBuonoPhDinEcono/mip-analysis-of-metadata-and-data-revisions 20131213dmesitem11-5finaldb-161006175354
Macroeconomic Imbalances Procedure ]]>

Macroeconomic Imbalances Procedure ]]>
Thu, 06 Oct 2016 17:53:53 GMT /DarioBuonoPhDinEcono/mip-analysis-of-metadata-and-data-revisions DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) MIP: Analysis of metadata and data revisions DarioBuonoPhDinEcono Macroeconomic Imbalances Procedure <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/20131213dmesitem11-5finaldb-161006175354-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Macroeconomic Imbalances Procedure
MIP: Analysis of metadata and data revisions from Dario Buono
]]>
176 4 https://cdn.slidesharecdn.com/ss_thumbnails/20131213dmesitem11-5finaldb-161006175354-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
New innovative 3 way anova a-priori test for direct vs. indirect approach in seasonal adjustment /slideshow/new-innovative-3-way-anova-apriori-test-for-direct-vs-indirect-approach-in-seasonal-adjustment/66820100 newinnovative3-wayanovaa-prioritestfordirectvs-161006174240
European Indicators]]>

European Indicators]]>
Thu, 06 Oct 2016 17:42:40 GMT /slideshow/new-innovative-3-way-anova-apriori-test-for-direct-vs-indirect-approach-in-seasonal-adjustment/66820100 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) New innovative 3 way anova a-priori test for direct vs. indirect approach in seasonal adjustment DarioBuonoPhDinEcono European Indicators <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/newinnovative3-wayanovaa-prioritestfordirectvs-161006174240-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> European Indicators
New innovative 3 way anova a-priori test for direct vs. indirect approach in seasonal adjustment from Dario Buono
]]>
369 8 https://cdn.slidesharecdn.com/ss_thumbnails/newinnovative3-wayanovaa-prioritestfordirectvs-161006174240-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Eurostat tools for benchmarking and seasonal adjustment j_demetra+ and jecotrim_buono_final_20130820 /slideshow/eurostat-tools-for-benchmarking-and-seasonal-adjustment-jdemetra-and-jecotrimbuonofinal20130820/66819953 sessionss2r5eurostattoolsforbenchmarkingandseasonaladjustmentjdemetraandjecotrimbuonofinal20130820-161006173815
Practical issues in benchmarking and seasonal adjustment]]>

Practical issues in benchmarking and seasonal adjustment]]>
Thu, 06 Oct 2016 17:38:15 GMT /slideshow/eurostat-tools-for-benchmarking-and-seasonal-adjustment-jdemetra-and-jecotrimbuonofinal20130820/66819953 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Eurostat tools for benchmarking and seasonal adjustment j_demetra+ and jecotrim_buono_final_20130820 DarioBuonoPhDinEcono Practical issues in benchmarking and seasonal adjustment <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/sessionss2r5eurostattoolsforbenchmarkingandseasonaladjustmentjdemetraandjecotrimbuonofinal20130820-161006173815-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Practical issues in benchmarking and seasonal adjustment
Eurostat tools for benchmarking and seasonal adjustment j_demetra+ and jecotrim_buono_final_20130820 from Dario Buono
]]>
881 3 https://cdn.slidesharecdn.com/ss_thumbnails/sessionss2r5eurostattoolsforbenchmarkingandseasonaladjustmentjdemetraandjecotrimbuonofinal20130820-161006173815-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Detecting outliers at the end of the series using forecast intervals /DarioBuonoPhDinEcono/detecting-outliers-at-the-end-of-the-series-using-forecast-intervals detectingoutliersattheendoftheseriesusingforecastintervals-161006173340
New technique for predictability, uncertainty, implied volatility and statistical analysis of risk using SARIMA forecast intervals]]>

New technique for predictability, uncertainty, implied volatility and statistical analysis of risk using SARIMA forecast intervals]]>
Thu, 06 Oct 2016 17:33:40 GMT /DarioBuonoPhDinEcono/detecting-outliers-at-the-end-of-the-series-using-forecast-intervals DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Detecting outliers at the end of the series using forecast intervals DarioBuonoPhDinEcono New technique for predictability, uncertainty, implied volatility and statistical analysis of risk using SARIMA forecast intervals <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/detectingoutliersattheendoftheseriesusingforecastintervals-161006173340-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> New technique for predictability, uncertainty, implied volatility and statistical analysis of risk using SARIMA forecast intervals
Detecting outliers at the end of the series using forecast intervals from Dario Buono
]]>
139 2 https://cdn.slidesharecdn.com/ss_thumbnails/detectingoutliersattheendoftheseriesusingforecastintervals-161006173340-thumbnail.jpg?width=120&height=120&fit=bounds presentation 000000 http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
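The forecast-interval idea above can be sketched in a few lines. The example below is a hedged illustration only: it fits a plain AR(1) model by least squares instead of a full SARIMA model, and the simulated series, coefficient (0.8), and injected extreme value are invented for demonstration.

```python
import math
import random

def ar1_forecast_interval(series, z=1.96):
    """Fit y_t = c + phi * y_{t-1} + e_t by least squares and return a
    one-step-ahead forecast interval (lo, hi) for the next observation."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    resid = [b - (c + phi * a) for a, b in zip(x, y)]
    sigma = math.sqrt(sum(e * e for e in resid) / (n - 2))
    forecast = c + phi * series[-1]
    return forecast - z * sigma, forecast + z * sigma

# Simulate an AR(1) series, then check a new end-of-series observation.
random.seed(0)
series = [0.0]
for _ in range(199):
    series.append(0.8 * series[-1] + random.gauss(0, 1))

lo, hi = ar1_forecast_interval(series)
new_obs = 0.8 * series[-1] + 8.0      # artificially extreme next value
is_outlier = not (lo <= new_obs <= hi)
```

An end-of-series observation that falls outside the interval is flagged as a potential outlier; the slides' approach would replace the AR(1) step with a fitted SARIMA model and its forecast intervals.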
1 out of 20 scenarios /slideshow/1-out-of-20-scenarios/66819725 buono-etaltemporaldisaggregationntts2015-161006173203
1 out of 20 possible scenarios: how to perform temporal disaggregation of annual sector accounts data]]>

1 out of 20 possible scenarios: how to perform temporal disaggregation of annual sector accounts data]]>
Thu, 06 Oct 2016 17:32:02 GMT /slideshow/1-out-of-20-scenarios/66819725 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) 1 out of 20 scenarios DarioBuonoPhDinEcono 1 out of 20 possible scenarios: how to perform temporal disaggregation of annual sector accounts data <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/buono-etaltemporaldisaggregationntts2015-161006173203-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> 1 out of 20 possible scenarios: how to perform temporal disaggregation of annual sector accounts data
1 out of 20 scenarios from Dario Buono
]]>
114 3 https://cdn.slidesharecdn.com/ss_thumbnails/buono-etaltemporaldisaggregationntts2015-161006173203-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
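One elementary member of the family of temporal disaggregation scenarios is pro-rata distribution with a quarterly indicator. The sketch below is a minimal, hypothetical example (function name and data are invented, and it is not necessarily the scenario analysed in the slides): each annual total is spread over its four quarters in proportion to an indicator series, so the quarterly path sums back exactly to the annual figure.

```python
def prorata_disaggregate(annual_totals, quarterly_indicator):
    """Distribute each annual total across its four quarters in
    proportion to a quarterly indicator series. Returns a quarterly
    series whose yearly sums equal the annual totals exactly."""
    quarterly = []
    for year, total in enumerate(annual_totals):
        block = quarterly_indicator[4 * year: 4 * year + 4]
        block_sum = sum(block)
        quarterly.extend(total * q / block_sum for q in block)
    return quarterly

# One year of annual data, distributed by a quarterly indicator.
quarterly = prorata_disaggregate([100.0], [20.0, 30.0, 25.0, 25.0])
```

More refined methods (Denton, Chow-Lin) additionally smooth the quarter-to-quarter movements, which simple pro-rata does not.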
Eurostat methodological skills staff survey lesson learned final /DarioBuonoPhDinEcono/eurostat-methodological-skills-staff-survey-lesson-learned-final eurostatmethodologicalskillsstaffsurveylessonlearnedfinal-161006172754
Staff Survey]]>

Staff Survey]]>
Thu, 06 Oct 2016 17:27:54 GMT /DarioBuonoPhDinEcono/eurostat-methodological-skills-staff-survey-lesson-learned-final DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Eurostat methodological skills staff survey lesson learned final DarioBuonoPhDinEcono Staff Survey <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/eurostatmethodologicalskillsstaffsurveylessonlearnedfinal-161006172754-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Staff Survey
Eurostat methodological skills staff survey lesson learned final from Dario Buono
]]>
138 6 https://cdn.slidesharecdn.com/ss_thumbnails/eurostatmethodologicalskillsstaffsurveylessonlearnedfinal-161006172754-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Reliability of estimates in socio-demographic groups with small samples /slideshow/reliability-of-estimates-in-sociodemographic-groups-with-small-samples/65816118 buonosaefinal-160908094829
The aim of this work is twofold: to investigate the possibilities of implementing a model-based approach in official statistics, so as to ensure the reliability of data on social conditions across different breakdowns; and to discuss the advantages, disadvantages, and potential use of small area estimation techniques and tools in the production of official statistics. To analyse the fit of the models to different types of data, various runs were conducted using several small area estimation techniques (such as empirical Bayes, hierarchical Bayes, etc.) already built into the R software (packages sae, hbsae, etc.) to obtain area-level and unit-level at-risk-of-poverty estimates and the mean squared errors of those estimates.]]>

The aim of this work is twofold: to investigate the possibilities of implementing a model-based approach in official statistics, so as to ensure the reliability of data on social conditions across different breakdowns; and to discuss the advantages, disadvantages, and potential use of small area estimation techniques and tools in the production of official statistics. To analyse the fit of the models to different types of data, various runs were conducted using several small area estimation techniques (such as empirical Bayes, hierarchical Bayes, etc.) already built into the R software (packages sae, hbsae, etc.) to obtain area-level and unit-level at-risk-of-poverty estimates and the mean squared errors of those estimates.]]>
Thu, 08 Sep 2016 09:48:29 GMT /slideshow/reliability-of-estimates-in-sociodemographic-groups-with-small-samples/65816118 DarioBuonoPhDinEcono@slideshare.net(DarioBuonoPhDinEcono) Reliability of estimates in socio-demographic groups with small samples DarioBuonoPhDinEcono The aim of this work is twofold: to investigate the possibilities of implementing a model-based approach in official statistics, so as to ensure the reliability of data on social conditions across different breakdowns; and to discuss the advantages, disadvantages, and potential use of small area estimation techniques and tools in the production of official statistics. To analyse the fit of the models to different types of data, various runs were conducted using several small area estimation techniques (such as empirical Bayes, hierarchical Bayes, etc.) already built into the R software (packages sae, hbsae, etc.) to obtain area-level and unit-level at-risk-of-poverty estimates and the mean squared errors of those estimates. <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/buonosaefinal-160908094829-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> The aim of this work is twofold: to investigate the possibilities of implementing a model-based approach in official statistics, so as to ensure the reliability of data on social conditions across different breakdowns; and to discuss the advantages, disadvantages, and potential use of small area estimation techniques and tools in the production of official statistics. To analyse the fit of the models to different types of data, various runs were conducted using several small area estimation techniques (such as empirical Bayes, hierarchical Bayes, etc.) already built into the R software (packages sae, hbsae, etc.) to obtain area-level and unit-level at-risk-of-poverty estimates and the mean squared errors of those estimates.
Reliability of estimates in socio-demographic groups with small samples from Dario Buono
]]>
130 4 https://cdn.slidesharecdn.com/ss_thumbnails/buonosaefinal-160908094829-thumbnail.jpg?width=120&height=120&fit=bounds presentation 000000 http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
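The classic area-level model behind packages such as sae is Fay-Herriot: a direct survey estimate for each area is shrunk toward a regression prediction, with more shrinkage where the sampling variance is larger. The sketch below is a deliberately simplified, intercept-only version with a crude moment estimator of the between-area variance and invented data; the R packages cited above implement the full model with covariates and proper MSE estimation.

```python
def fay_herriot_intercept(direct, psi):
    """Intercept-only Fay-Herriot model: y_i = mu + v_i + e_i, with
    known sampling variances psi_i = var(e_i). Returns EBLUP-style
    estimates shrinking each direct estimate toward the overall mean."""
    m = len(direct)
    ybar = sum(direct) / m
    s2 = sum((y - ybar) ** 2 for y in direct) / (m - 1)
    # Crude ANOVA-type estimate of the between-area variance.
    sigma2_v = max(0.0, s2 - sum(psi) / m)
    # Precision-weighted estimate of the common mean mu.
    w = [1.0 / (sigma2_v + p) for p in psi]
    mu = sum(wi * yi for wi, yi in zip(w, direct)) / sum(w)
    # Shrinkage factor: small-psi areas keep their direct estimate.
    gamma = [sigma2_v / (sigma2_v + p) for p in psi]
    return [g * y + (1 - g) * mu for g, y in zip(gamma, direct)]

# Four hypothetical areas: direct at-risk-of-poverty rates and their
# sampling variances (larger variance => more shrinkage to the mean).
est = fay_herriot_intercept([12.0, 25.0, 18.0, 30.0], [4.0, 16.0, 2.0, 25.0])
```

Each composite estimate lies between the area's direct estimate and the overall mean, which is the basic mechanism that stabilises estimates for domains with small samples.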
https://cdn.slidesharecdn.com/profile-photo-DarioBuonoPhDinEcono-48x48.jpg?cb=1725789070 Doing Time Series Econometrics and developing Data Analytics @EU_Eurostat Training NSIs in Seasonal Adjustment and Temporal Disaggregation using JDEMETRA+. Interested in new methods for new data WiDS ambassador ec.europa.eu/eurostat https://cdn.slidesharecdn.com/ss_thumbnails/uscensus-final-240908100148-47c7185a-thumbnail.jpg?width=320&height=320&fit=bounds slideshow/introduction-to-llms-and-their-relevance-for-official-statistics/271646110 Introduction to LLMs a... https://cdn.slidesharecdn.com/ss_thumbnails/table3bigdataandnowcastingbuono-180608133523-thumbnail.jpg?width=320&height=320&fit=bounds slideshow/big-data-and-nowcasting/101317432 Big Data and Nowcasting https://cdn.slidesharecdn.com/ss_thumbnails/ips019s2iaccarino-170714100346-thumbnail.jpg?width=320&height=320&fit=bounds DarioBuonoPhDinEcono/reporting-uncertainties-too-much-information Reporting uncertaintie...