In: Wasserwirtschaft: Hydrologie, Wasserbau, Boden, Ökologie; Organ der Deutschen Vereinigung für Wasserwirtschaft, Abwasser und Abfall, Vol. 109, No. 7-8, pp. 44-47
Eurofleets+ adopted a Data Policy making EF+ cruise data findable, accessible, interoperable and reusable (FAIR). Data management (DM) is integrated and deployed in synergy with SeaDataNet, a European network of NODCs. The DM strategy is to ensure that metadata and data from TA (Transnational Access) cruises become available for dissemination and inclusion in major European and global marine data exchange systems. To achieve this, research teams are required to formulate their cruise DM plans and to use components designed for deploying the EF+ DM strategy: 1) equip RVs with a shipboard system (EARS) to gather and transfer metadata and data as acquired during cruises, both by automatic systems and through manual entries; 2) assign DM experts (NODCs) to assist principal investigators and vessel operators before, during, and after the TA cruises; 3) validate and archive all gathered metadata and data at NODCs for long-term stewardship and wider distribution, using SeaDataNet for exchange and publishing at several European and international portals. A central distinction is made between 'en-route' data acquired by fixed sensors and 'manual' data from human operations, which require post-processing, e.g. analysing samples. EARS will gather en-route data for regular transfer and publishing at the EF+ EVIOR portal in a dynamic vessel-tracking interface, using SWE (Sensor Web Enablement) techniques.
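To make the en-route data flow more concrete, the sketch below shows what a single observation from a fixed shipboard sensor might look like when serialized along SWE/O&M lines for periodic transfer from the vessel to the shore-side data centre. This is a purely illustrative assumption, not the actual EARS schema; all identifiers and field names are hypothetical.

```python
# Illustrative sketch (not the EARS schema): one 'en-route' observation from a
# fixed shipboard sensor, serialized as JSON in the spirit of the OGC SWE /
# Observations & Measurements model, ready for regular ship-to-shore transfer.
import json
from datetime import datetime, timezone

observation = {
    "procedure": "urn:example:sensor:thermosalinograph-01",  # assumed sensor identifier
    "observedProperty": "sea_water_temperature",
    "featureOfInterest": {                                    # vessel position at sampling time
        "type": "Point",
        "coordinates": [-12.345, 48.678],                     # lon, lat (illustrative values)
    },
    "phenomenonTime": datetime.now(timezone.utc).isoformat(),
    "result": {"value": 14.2, "uom": "degC"},
}

print(json.dumps(observation, indent=2))
```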
This article presents a case study on the experimental co-creation process of a digital platform supporting Sustainable Public Food Procurement (SPFP) in public kindergartens in a medium-sized city in Poland. The organisation of SPFP requires a dedicated technological infrastructure to ensure the information flow among food producers, kindergarten employees, children and parents. To this end, a digital platform was designed to enable contact, assessment of food quality and food procurement environmental impact, and the communication of needs and problems among all the actors involved in the food procurement system for kindergartens. The article also discusses the results of the field research and the method of Urban Living Labs, highlighting the key challenges faced by those seeking to combine knowledge about food and the natural environment with public food procurement. The principal difficulties include the availability, accessibility and possible application of data on the environmental costs of food production, the individualisation of needs and motivations related to public catering in educational facilities, and the specific nature of the public sector responsible for public food procurement.
Spatial Data Infrastructures (SDI) established during the past two decades "unlocked" heterogeneous geospatial datasets. The European Union INSPIRE Directive laid down the foundation of a pan-European SDI in which thousands of public-sector data providers make their data, including sensor observations, available for cross-border and cross-domain reuse. At the same time, SDIs must inevitably adopt new technology and standards to remain fit for purpose and to address, in the best possible way, the needs of different stakeholders (government, businesses and citizens). Recurring technical requirements raised by SDI stakeholders include: (i) the need to adopt RESTful architectures; (ii) alternative (to GML) data encodings, such as JavaScript Object Notation (JSON) and binary exchange formats; and (iii) adoption of asynchronous publish-subscribe messaging protocols. The newly established OGC standard SensorThings API is particularly interesting to investigate for INSPIRE, as it addresses all three topics together. In this manuscript, we provide our synthesised perspective on the steps necessary for the OGC SensorThings API standard to be considered a solution that meets the legal obligations stemming from the INSPIRE Directive. We share our perspective on what should be done concerning: (i) data encoding; and (ii) the use of the SensorThings API as a download service.
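To illustrate the RESTful, JSON-based access pattern the abstract refers to, here is a minimal sketch of a client reading sensor observations from a SensorThings API v1.1 service. The base URL is a placeholder assumption; the entity model (Things, Datastreams, Observations) and the OData-style query options ($select, $expand, $orderby, $top) are defined by the standard, which also specifies an MQTT extension covering the publish-subscribe requirement mentioned above.

```python
# Minimal sketch of reading data from an OGC SensorThings API v1.1 service.
# The endpoint below is hypothetical; entity IDs are assumed to be numeric.
import requests

BASE_URL = "https://example.org/sta/v1.1"  # hypothetical SensorThings endpoint

# List a few Datastreams together with the property each one observes.
resp = requests.get(
    f"{BASE_URL}/Datastreams",
    params={
        "$select": "id,name",
        "$expand": "ObservedProperty($select=name)",
        "$top": 5,
    },
    timeout=30,
)
resp.raise_for_status()
datastreams = resp.json()["value"]
for ds in datastreams:
    print(ds["@iot.id"], ds["name"], ds["ObservedProperty"]["name"])

# Fetch the ten most recent Observations of the first Datastream.
ds_id = datastreams[0]["@iot.id"]
obs = requests.get(
    f"{BASE_URL}/Datastreams({ds_id})/Observations",
    params={"$orderby": "phenomenonTime desc", "$top": 10},
    timeout=30,
)
obs.raise_for_status()
for o in obs.json()["value"]:
    print(o["phenomenonTime"], o["result"])
```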
In the next decade, the pressures on ocean systems and the communities that rely on them will increase along with impacts from the multiple stressors of climate change and human activities. Our ability to manage and sustain our oceans will depend on the data we collect and the information and knowledge derived from it. Much of the uptake of this knowledge will be outside the ocean domain, for example by policy makers, local governments, custodians, and other organizations, so it is imperative that we democratize, or open, the access and use of ocean data. This paper looks at how technologies, scoped by standards, best practice and communities of practice, can be deployed to change the way that ocean data is accessed, utilized, augmented and transformed into information and knowledge. The current portal-download model, which requires the user to know what data exists, where it is stored, in what format and with what processing, limits the uptake and use of ocean data. Using examples from a range of disciplines, a web-services model of data and information flows is presented. A framework is described, including the systems, processes and human components, which delivers a radical rethink of the delivery of knowledge from ocean data. A series of statements describes parts of the future vision, along with recommendations about how this may be achieved. The paper recommends the development of virtual test-beds for end-to-end development of new data workflows and knowledge pathways. This supports the continued development, rationalization and uptake of standards; creates a platform around which a community of practice can be developed; promotes cross-discipline engagement from ocean science through to ocean policy; allows the commercial sector, including the informatics sector, to partner in delivering outcomes; and provides a focus to leverage long-term sustained funding. The next 10 years will be "make or break" for many ocean systems. The decadal challenge is to develop the governance and cooperative mechanisms to harness emerging information technology to deliver on the goal of generating the information and knowledge required to sustain oceans into the future.
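As a concrete illustration of the web-services model contrasted with portal-download above, the sketch below requests exactly the subset of observations it needs from an ERDDAP-style tabledap endpoint and receives it ready for analysis, with no manual discovery, download or format conversion by the user. The server URL and dataset identifier are assumptions for illustration only; the paper does not prescribe a particular service.

```python
# Illustrative web-services access pattern: ask an ERDDAP-style server for a
# spatio-temporal subset as CSV. Endpoint and dataset ID are hypothetical.
import io
from urllib.parse import quote

import pandas as pd
import requests

ERDDAP = "https://example.org/erddap"      # hypothetical ocean-data server
DATASET_ID = "moored_buoy_sst"             # hypothetical tabledap dataset ID

# Variables to return, followed by constraints on the subset; operator
# characters in the constraints are percent-encoded as the protocol expects.
variables = "time,latitude,longitude,sea_surface_temperature"
constraints = "&time>=2023-01-01T00:00:00Z&time<=2023-01-07T00:00:00Z"
query = variables + quote(constraints, safe="&=")

resp = requests.get(f"{ERDDAP}/tabledap/{DATASET_ID}.csv?{query}", timeout=60)
resp.raise_for_status()

# Only the requested subset comes back; the second CSV row carries units.
df = pd.read_csv(io.StringIO(resp.text), skiprows=[1])
print(df.head())
```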