
Annex 1

INSPIRE Dashboard

-------------------------

D-B 1) Do you think there should be a common dashboard between MS?

Yes 16 100%

No 0 0%

D-B 2) Where should the dashboard be implemented?

a) at central level only 7 44%

b) at central level and (extended) at Member State level 7 44%

Other 2 13%

Spain: yes

Belgium: at central level, and at MS level WHEN resources are available

UK: At Member State level and collated at central level (ie indicators harvested by Commission)


D-B 3) The dashboard should contribute to monitor INSPIRE implementation process and progress of EUcountries

Agree 14 88%

Disagree 0 0%

Other 2 13%

Denmark: The dashboard should show which data sets and services each Member State provides to INSPIRE and which of them are INSPIRE compliant. This overview can inspire other member states in their efforts to identify data sets in scope of annexes 1, 2 and 3.

UK: Agree - but on condition that Member States can control when the indicators are calculated.

D-B 3.1) If "disagree" on the above assertion, could you explain why?

UK: As National Contact Point we would be concerned if dynamically calculated indicators in a dashboard were to be used as a basis for enforcing INSPIRE compliance. Therefore we would propose that the dashboard harvest indicators that are calculated in local dashboards, on a cycle determined by Member States with a minimum specified frequency (we would suggest annually, in line with the regulations). Indicators could be harvested in the XML format already specified for returning monitoring information, or an updated version of it.

D-B 4 - a) One role of the dashboard should be to monitor the implementation status of INSPIRE in the Member States at a certain time, i.e. the dashboard reflects the content of the INSPIRE discovery services on that certain time

Agree 14 88%

Disagree 1 6%

Other 1 6%


D-B 4 - b) If "disagree" on the above assertion, could you explain why?

Finland: Not sure if "a certain time" here means real-time monitoring. It should contain a view of the monitoring status at the end of each year but also real-time (for example daily or weekly) status monitoring: the yearly monitoring status view for seeing the long-term picture and the real-time status for seeing the current status.

UK: The time at which the dashboard is monitored should be controlled by each Member State National Contact Point, to avoid mis-reporting.

D-B 5) Which is the main audience of the dashboard?

a) European Commission 1 6%

b) Member States 1 6%

c) Spatial data user community 1 6%

d) all of them 11 69%

Other 2 13%

D-B 5 - b) If "b" on the above assertion, please elaborate on the type of information which could be interesting - useful to the spatial dataset and services users community

Spain: with the abstract (the same as the abstract element of the metadata file)

Finland:
- list of national INSPIRE datasets and services
- availability of dataset and service metadata and link to available metadata
- metadata conformity and a validation report if not conformant
- dataset availability in services and link to those services
- service conformity and a validation report if not conformant

The Netherlands: It is useful to know if there is data on a specific theme (not yet available in the portal) for a region

Sweden: The information in the Excel monitoring sheets is of great interest, as well as the custodians of the different datasets. It would help the NCPs to follow and present the progress and availability of data within the country, as well as to identify new cross-border datasets and to find contact organisations in other countries for these. It will also give a good overview of how different countries define their datasets and what data is available from a cross-border perspective.

UK: Progress in other Member States, particularly information about the types of data that have been published for each theme. Achieving consistency across territories is proving difficult, and it will become more difficult as we move into transformation, so it would be useful to have this information.

D-B 6) Which information the dashboard should provide?

a) monitoring information only 5 31%

b) monitoring and reporting information 1 6%

c) conformity issues of metadata, data and services (e.g. validation results) 2 13%

d) all of them 4 25%

Other 4 25%

Spain: in two parts:
- with dataset and service compliance metadata
- with dataset and service non-compliance metadata

Italy: a) & c)

Belgium: monitoring information including validation results (Y/N or even more detailed?) IF all MS use the same validation tools which should be provided by the EC

France: a) & c)

The Netherlands: a + c


D-B 7 - a) The dashboard should function completely automatically by requesting the INSPIRE discovery services that are registered in the INSPIRE registry and deriving the monitoring information from the metadata the services provide

Agree 5 31%

Disagree 1 6%

Agree but this is not feasible presently 9 56%

Other 1 6%

Belgium: Agree, IF metadata is adapted whenever it appears that other information is useful/necessary for the dashboard/monitoring (which is at the moment not yet included in the metadata). Also: not all information can be included in the MD (e.g. use of services during a certain period). Therefore it is necessary to allow data input into the dashboard manually or from other sources than the metadata.

D-B 7 - b) If it is currently not feasible, could you explain why?

Denmark: The result of an automated process would rely on the quality of metadata, and from the Danish point of view the quality of metadata is in some instances variable.

Italy: Because of the heterogeneous status of implementation in public authorities, it is difficult to derive the monitoring information automatically.

Greece: Because in Greece we do not yet have discovery services for all the data and spatial data services that fall under the scope of the INSPIRE directive. We would therefore like to be able to monitor the progress of our data producers manually, and we want to know which datasets and services exist and should be compliant with INSPIRE. We think that non-compliant data and services should be part of the monitoring list, and we want to know that they do not have metadata, that they are not INSPIRE compliant and that there are no access services for them. Completely automatic monitoring of the implementation of the directive is correct and is the only way to actually be sure that the monitoring information is real. Unfortunately, it is not feasible yet.

Belgium:
- metadata does not contain all the necessary information for automatic monitoring: (1) some fields required for monitoring are not present (yet) in metadata, (2) metadata is not always kept up to date by the data provider, (3) some metadata fields which will be necessary for the dashboard are not mandatory. Data providers will have to complete these metadata fields. (2) and (3) will be difficult since metadata is considered a burden by most data providers and has low priority.
- implementation cost to integrate the dashboard in the own geoportal (if wanted), and if wanted to extend it with additional information.

Finland: At least in Finland the discovery service also includes INSPIRE-conformant metadata for non-INSPIRE datasets and services. There should be a commonly agreed mechanism for separating the non-INSPIRE data from the INSPIRE ones. On the other hand, not all INSPIRE datasets and services have metadata documents in the national discovery service (at least not yet). The existence of these datasets and services should nevertheless also be indicated in the dashboard.

France: Some indicators (about area & use of infrastructure) are not in the metadata.

Germany: Some of the required monitoring information could not be derived from metadata (use of services, relevant and actual area).

UK: We disagree and this is not currently feasible. We disagree because we think Member State National Contact Points should have an opportunity to validate their indicators before they are presented to a dashboard. This is particularly so if it is to be used to monitor legal compliance. It is not feasible because not all indicators can be calculated programmatically, particularly not in a federated SDI (such as that implemented in the UK). (See also responses on indicators.)

Slovak Republic: Not all Slovak obliged organizations have registered their discovery services in the INSPIRE registry, and not all of them are completely in accordance with the requirements of INSPIRE.

Anonymous: Not all monitoring information is kept in the metadata (e.g. service requests, actual/relevant area). Datasets/services with no existing metadata are excluded.
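
The answers above point towards an automated flow: harvest ISO 19139 records from the national discovery service, separate INSPIRE records from the rest, and count what can be counted. The following is a minimal sketch of that idea in Python, assuming the records have already been harvested into local XML files; the folder name and the "INSPIRE" keyword convention are illustrative assumptions, not part of any agreed specification, and (as several respondents stress) the conformity statement read here is self-declared rather than validated.

import glob
import xml.etree.ElementTree as ET

# ISO 19139 namespaces used by INSPIRE discovery metadata
NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
}

def keywords(record):
    # All free-text keywords carried by one metadata record
    return [el.text.strip()
            for el in record.findall(".//gmd:keyword/gco:CharacterString", NS)
            if el.text]

total = inspire = declared_conformant = 0
for path in glob.glob("harvested_metadata/*.xml"):   # assumed folder of harvested records
    record = ET.parse(path).getroot()
    total += 1
    if "INSPIRE" in keywords(record):                # assumed national tagging convention
        inspire += 1
        # First conformance statement in the record (self-declared, not validated)
        passed = record.find(".//gmd:DQ_ConformanceResult/gmd:pass/gco:Boolean", NS)
        if passed is not None and (passed.text or "").strip().lower() == "true":
            declared_conformant += 1

print(f"records harvested:        {total}")
print(f"tagged as INSPIRE:        {inspire}")
print(f"self-declared conformant: {declared_conformant}")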

D-B 8) Describe the main benefits you perceive from having a dashboard

Analysis of the answers: This question was interpreted and answered as "what is the function/role you perceive/want the dashboard to have." Perceived roles/functionality:

a) Reduce, through automated (real-time) dashboard processes, MS monitoring and reporting obligations and the associated administrative burden.

b) Improved, user-friendly, accessible, comparative information regarding progress in implementing INSPIRE, which would motivate wider INSPIRE compliance by public authorities.

c) Facilitated access and search services to datasets and metadata, according to different themes, by different countries.

Spain: The statistical knowledge of datasets and services in the whole country

Cyprus: Monitoring the current status of SDI at any time.

Estonia: Generates reports automatically; will reduce development cost; will help conformity testing

Denmark: The dashboard will be a tool for the member states to inspire them in their efforts to identify data sets in scope of annexes 1, 2 and 3, e.g. to answer the question "which data sets have my neighboring countries identified in scope of annex 3 US?"

Italy: easier recovery of information related to INSPIRE monitoring; full pan-European landscape of INSPIRE monitoring activities easily accessible; helps the EC focus on MS weaknesses or gaps

Greece: The main benefit would be the real-time information on all the components of the member states' and European infrastructure.

Belgium:
- automatic and up-to-date (semi real-time) monitoring
- use of metadata (+ stimulus for data providers to update their metadata)
- link between metadata, data and its services (should be implemented in the dashboard)
- user friendly and easily publicly available
- clear overview using graphics


Finland: Automating the collection of monitoring information saves a great amount of yearly work. Showing an up-to-date status of INSPIRE implementation makes it easier for organisations to see what they still have to do to fulfil INSPIRE requirements. The dashboard gives a more visible and concrete view of the monitoring information and of INSPIRE in general.

France: An easier way to show monitoring results to managers and politicians; a way to compare national results with those of other European countries.

Germany: Transparency about the implementation status of INSPIRE in the member states.

The Netherlands: efficiency (collecting the information by hand is a lot of work); comparable information (it is collected the same way); actual (it reflects the situation, not the wish)

Sweden: The possibility to get a quick overview of all available datasets and services within different countries would be a major benefit, as well as who the data custodian in the respective country is. Currently, each separate Excel monitoring sheet has to be downloaded and manipulated in order to get an idea of a specific aspect. It will also give a good overview of how different countries define their datasets and what data is available from a cross-border perspective.

UK: Greater transparency on the progress in delivering INSPIRE. Greater visibility on the extent of data published and on the performance and use of data services, enabling the benefits of these services to be demonstrated. Reduced burden on each MS to provide data, if the dashboard is implemented correctly and provided it is combined with changes to the monitoring indicators.

Slovak Republic: eliminating manual copying of information from metadata; reducing the errors in monitoring and reporting; a more user-friendly graphical interface; exchange of standards for M&R between member states; clarification of requirements for M&R

Anonymous: simplification of the national monitoring process

D-B 9) Describe the main difficulties you perceive in having a dashboard

Analysis of the answers: The key difficulties described concern feasibility issues regarding the development of the dashboard, due to its reliance on metadata. Current metadata problems can be summarized as:

a) Some fields required for the dashboard and monitoring are missing from metadata or are currently not mandatory

b) Poor quality metadata (e.g. not updated, incomplete, unverifiable, different interpretations of the description fields)

c) Absence of metadata for many datasets

Additionally, the potential financial burden on MS of implementing the dashboard in national geoportals was mentioned as a difficulty.

Spain: The lack of feedback from the INSPIRE community. More examples, guidelines, or explanatory rules are necessary (the number of records is very different in each country for the same theme). For instance: is an inventory of plants a dataset?

Cyprus: N/A

Estonia: developing user-friendly interface

Denmark: The dashboard shall show the member states' monitoring information based on data set and service metadata. The quality of metadata plays a huge role in this process. The dashboard should not be used as a control tool to check for conformity and progress in each member state's INSPIRE implementation - there can be several reasons why the number of available data sets increases or decreases, e.g. organizational changes, new data-production cooperation etc.


Italy: Implementing and managing the dashboard (economically and technically) at national scale; difficulties in applying standards for interoperability between the national and central nodes; lack of a national endpoint node

Greece: If I have understood the structure of the dashboard well, one difficulty will be the fact that many of our datasets and services that fall under INSPIRE are not accessible by network or spatial data services, so it would be difficult to have real time information on them.

Belgium:
- metadata does not contain all the necessary information for automatic monitoring: (1) some fields required for monitoring are not present (yet) in metadata, (2) metadata is not always kept up to date by the data provider, (3) some metadata fields which will be necessary for the dashboard are not mandatory. Data providers will have to complete these metadata fields. (2) and (3) will be difficult since metadata is considered a burden by most data providers and has low priority.
- implementation cost to integrate the dashboard in the own geoportal (if wanted), and if wanted to extend it with additional information.

France: To reach consensus on functionalities (quite easy) and on design (much more difficult).

Germany: The dashboard could lead to a "competition" between the member states. The dashboard could be used as a control instrument by the commission.

The Netherlands: not all indicators can be automatically generated out of the metadata

Sweden: Metadata is not always correct, partly because of human errors entering faulty information, partly due to the communication between the national geoportals and the INSPIRE geoportal. This may be a source of erroneous information in the dashboard system. This will, however, eventually be corrected as errors are discovered and data custodians can be made aware of them. Another problem might be to differentiate between INSPIRE datasets and other national datasets (and products).

UK: Getting appropriate data from the metadata records to provide the information, particularly as the quality of metadata is variable, especially with some of the less mature data publishers. A lack of underpinning standards for the exchange of monitoring information.

Slovak Republic: governance; licensing framework; manual input still needed

Anonymous: the potential difficulties with the metadata quality (in the initial phase) will lead to better metadata quality later


D-B 10 - a) Do you think it is necessary to modify any of the existing indicators?

No 2 13%

Yes 13 87%

Other 0 0%

D-B 10 - b) If "yes" on the above question, please indicate which indicator(s) and why.

Poland: DSi1, NSi3,

Denmark: Area indicators are irrelevant - so far their existence in the monitoring is unknown. The indicator for metadata existence is irrelevant if the monitoring is based on metadata. The indicator for metadata compliance to Metadata IR is irrelevant if the monitoring is based on metadata in discovery services. The indicator for data set / service is available in Discovery Service is irrelevant if the monitoring is based on metadata in discovery services.

Italy: DSi1; DSi1.1; DSi1.2; DSi1.3 - not relevant
NSi3; NSi3.1; NSi3.2; NSi3.3; NSi3.4; NSi3.5 - comparability issues

Greece: I believe that the relevant and actual area do not give any valuable information on the implementation of the directive. This indicator has a meaning if it is reported for a dataset that has not been completed yet. Since the directive does not set requirements for the collection of data, counting the completion of the dataset might be out of the scope of the directive.

Finland: DSi1 indicator (Geographical coverage of spatial data sets) and NSi3 indicator (Use of all network services). The information needed to calculate these indicators has to be collected manually from the data-providing organizations.

France: DSi1- NSi3

Germany: The indicators "Use of services" (NSi3) and "Geographical coverage" (DSi1) are quite difficult to calculate and don't say anything about the implementation status of INSPIRE.

The Netherlands: For calculation of the actual area and coverage more metadata is needed, at least the administrative unit the dataset covers. The use of the services is not in the metadata, nor in the new elements described in the SDSS; use monitored at the EU INSPIRE portal can be an alternative.

Sweden: The information for “relevant area” and “actual area” in particular. The indicator doesn’t provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all captured to 100 % of their intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress, not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). Also the indicator for usage of services needs to be considered; this indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously hold; for instance, metadata will always exist if it is extracted from the INSPIRE geoportal.

UK: We think a comprehensive review of the indicators is required. It is clear that several indicators are redundant and that others offer little value to Member States, and indeed we struggle to see how they offer value to the Commission. We think this activity should be prioritised over a dashboard but done with the aim of delivering a dashboard in mind.

Anonymous: I am not sure about the usefulness of most indicators at all; some are entirely useless (e.g. number of services), some are useless after the initial phase until 2015/2016; maybe we need a better definition of some indicators
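
Sweden's dilution argument above is easy to verify with a short calculation. The sketch below treats the overall coverage figure as a simple average of per-dataset coverage ratios, i.e. it weights all datasets equally, as the comment implicitly does; the 250-dataset figure is taken from the comment itself.

# Quick check of the dilution effect on an aggregate coverage indicator:
# 250 datasets already at 100 % coverage, plus one new dataset still being captured.
complete = [1.0] * 250                 # 250 fully captured datasets (figure from Sweden's comment)
for partial in (0.2, 0.5):             # the new dataset at 20 % or 50 % coverage
    overall = sum(complete + [partial]) / (len(complete) + 1)
    print(f"new dataset at {partial:.0%}: overall indicator = {overall:.1%}")
# Output: 99.7 % and 99.8 %, matching the figures quoted in the answer.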

D-B 11 - a) Should the dashboard be linked to National Geoportals?

Yes 9 56%

No 2 13%

Other 5 31%

Poland: yes, if it is possible

Belgium: National and/or regional portals

The Netherlands: not sure

Anonymous: this should be possible, but not mandatory

D-B 11 - b) Could you elaborate the reasons of your choice in the above question?

Spain: We think it is necessary to link with the national catalogue.

Cyprus: Immediate access of local users.

Denmark: The dashboard should be linked to and harvest information from the EU INSPIRE Geoportal.

Italy: yes, but at the moment it is not easily feasible because of the heterogeneous status of implementation in public authorities

Greece: I am not sure I understand the actual linkage mentioned, but if it means that the user of the national geoportal will be given the technical means to be informed about the content of the dashboard, I think that this might be the correct way to pass the information of the dashboard to the users.

Finland: It seems obvious to do the linking because the dashboard provides information about the status of national SDI implementation.


France: Save money and time by using national SDI

Germany: Depends on what is meant by "linked".

The Netherlands: what is the benefit of that?

Sweden: The purpose of having a central dashboard is to limit the burden of monitoring on each member state (NCP) but still be able to access monitoring information, i.e. the result of the monitoring. Theoretically, the INSPIRE geoportal should contain the same information as the national geoportals, which means that an automatic routine for extraction of monitoring information from the INSPIRE geoportal should be enough.

UK: We believe that the majority of the indicators required to monitor INSPIRE can be derived from discovery metadata published through the national geoportal or by automatically querying the network services registered with these portals.

Slovak Republic: Ensure consistency with the other INSPIRE components (metadata, network services, interoperability, data sharing, reporting).

D-B 12) How do you deal with data sets and services that are not described with metadata yet?

Analysis of the answers: This was an important issue, dealt with differently using three basic approaches:

a) If a dataset or service doesn't have metadata, it is not recorded or considered for inclusion in the monitoring and reporting.

b) Manual recording of metadata information into the monitoring Excel sheets.

c) Purposeful creation of metadata and subsequent inclusion within the monitoring.

Approaches b) and c) were described as time-consuming and transitional, as the aim is for all to finally adopt approach a).

Spain: Don't include in monitoring and reporting

Cyprus: Pending issue that needs to be handled ASAP.

Estonia: Metadata is always described before we start dealing with the dataset

Denmark: They are part of the monitoring - their monitoring information is typed in manually. Their existence in the monitoring indicates progress in the implementation of INSPIRE.

Italy: data providers should describe data sets and services with metadata before publishing

Greece: We collect the relevant monitoring information from the data producers through an excel sheet that contains all the discovery metadata elements along with all the monitoring elements. We fill out the INSPIRE monitoring template manually. The most difficult part, since we do not have actual access to the datasets and services, is that we cannot be sure that the information we collect from the data producers is correct and we cannot be sure that the data actually fall under the scope of the directive.

Belgium: Stimulate data providers to describe their data and services with metadata. Datasets and services which have no metadata will not appear in the dashboard. Consequently, in the beginning, the information shown in the dashboard will differ from the information in the current excel files. It will grow and become more complete as metadata will be completed. The quality and completeness of the information of the dashboard will, for the main part, depend on the quality and completeness of the metadata.

Finland: If we know that a dataset or service exists we include it manually in the yearly INSPIRE monitoring information collection sheet. This is quite easy for public authorities. For municipalities (in Finland >300) we only include in the monitoring those datasets and services that have been described with metadata, because we don't know for sure about the existence of non-described data.

France: We do not even know if they exist in digital form, so we think it is impossible to deal with them.

Germany: Currently they are manually added in Excel. But in the future, if the monitoring information is derived from metadata, we have an agreement that only those data sets are included in the monitoring that are already described by metadata and accessible through a discovery service.

The Netherlands: we report on them

Sweden: This is a temporary problem only. As the SDI evolves, all data will eventually be published. Data not published are thus assumed non-existent for the time being. When the data custodians are ready to provide the data, they will also publish metadata for the data. The role of the NCPs is to inform and encourage the data custodians to publish metadata for their data as soon as possible, and also to help the data custodians identify which data are INSPIRE data (the dashboard will serve well in this context).

UK: We only report on datasets and services that are described with metadata; due to the federated nature of our data we would not be aware of INSPIRE data that was not registered.

Slovak Republic: Where possible and relevant, metadata are created; where not possible, information is monitored manually.

Anonymous: either we don’t know them or we ask the data owner to fulfil the INSPIRE obligations

D-B 13 - a) Are there only INSPIRE metadata accessible through your national discovery service?

Yes 3 31%

No 13 69%


D-B 13 - b) If "no" on the above question, how do you distinguish between metadata of INSPIRE data sets or services and other metadata?

Spain: By XML schema

Estonia: The name of the dataset in the metadata includes the word 'INSPIRE'

Denmark: By using keywords. We have instructed the data and service providers to use the keyword "INSPIRE" to indicate whether the data set / service is INSPIRE or not. Our monitoring application filters the metadata using this keyword.

Italy: through a "flag" in the Italian metadata format

Belgium: The metadatasets of INSPIRE datasets and services have the keyword “Lijst M&R INSPIRE”.

Finland: We have a "non-INSPIRE" keyword for non-INSPIRE datasets, but this hasn't been a very well-working solution.

France: See technical methodology sent to MIWP-5 : in two words, use of GEMET thesaurus in Metadata + analysis of URL for the services

Germany: We use an additional keyword in the metadata called "inspireidentifiziert".

The Netherlands: Based on a database attribute (category INSPIRE); it is not based on something in the metadata. It is set in the portal.

Sweden: In the metadata catalogue, a separate metadata element has been introduced in order to differentiate between INSPIRE data and other data published (not following the INSPIRE specifications but of interest mainly from a national perspective).

UK: Using the appropriate GEMET keywords in the metadata.

Anonymous: we have some OpenData services, most of them additional to the INSPIRE services, but not conformant
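
The answers to D-B 13-b describe keyword-based conventions for telling INSPIRE records apart from the rest of the national catalogue. The sketch below illustrates that approach; the marker keywords are the ones quoted in the answers above, and treating them as one combined list (rather than one convention per Member State) is purely illustrative.

import xml.etree.ElementTree as ET

NS = {"gmd": "http://www.isotc211.org/2005/gmd",
      "gco": "http://www.isotc211.org/2005/gco"}

# Marker keywords quoted in the answers above (one convention per country in practice)
INSPIRE_MARKERS = {"INSPIRE", "Lijst M&R INSPIRE", "inspireidentifiziert"}
NON_INSPIRE_MARKERS = {"non-INSPIRE"}          # Finland's inverse convention

def classify(record_xml):
    # Classify one ISO 19139 record as "inspire", "non-inspire" or "unknown"
    root = ET.fromstring(record_xml)
    kws = {el.text.strip()
           for el in root.findall(".//gmd:keyword/gco:CharacterString", NS)
           if el.text}
    if kws & NON_INSPIRE_MARKERS:
        return "non-inspire"
    if kws & INSPIRE_MARKERS:
        return "inspire"
    return "unknown"    # e.g. the Dutch case, where the flag is kept outside the metadata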


Annex 2

INSPIRE Indicators

-------------------------

MDi1 - a) How is the MDi1 indicator (Existence of Metadata) collected?

a) automated collection through metadata information 4 25%

b) through other automated process 4 25%

c) manually 8 50%

MDi1 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.

Anonymous: question is not clear: data are collected manually, indicator is computed in the EXCEL-file

MDi1 - c) Please provide comments regarding the indicator MDi1

Spain: We don't check it

Cyprus: Delay in collecting the information.

Greece: There is no way to make sure that the information we collect manually is real.

Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%).

Finland: Information on non-existing metadata can't be collected automatically.

France: easy: by construction the result is 100%, as we check that every known dataset is in the FR SDI


Sweden: As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 %

UK: We feel this indicator is not useful, in our case it is always 1 as we can only report a dataset or service if metadata exists due to the nature of our SDI.

Slovak Republic: All indicators are collected via a webform into a database from which the xls report is exported based on the monitoring template. Anyway, after export there is still a need for manual checks and updates.

Anonymous: not useful in future; you ask for the conformance to legal requirements - I am not sure if this makes sense

MDi1 - d) Should the indicator MDi1 be included in the dashboard?

a) yes 10 67%

b) no 5 33%

c) Yes under certain conditions 0 0%

Other 0 0%

MDi1,1/2/3 - a) How are the MDi1,1/2/3 indicators (Existence of Metadata for annexes I,II and III) collected?

a) automated collection through metadata information 4 27%

b) through other automated process 3 20%

c) manually 8 53%


MDi1,1/2/3 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

Anonymous: question is not clear: data are collected manually, indicator is computed in the EXCEL-file

MDi1,1/2/3 - c) Please provide comments regarding the indicators MDi1,1/2/3

Spain: We don't check it

Cyprus: Serious delay in collection. Useful information.

Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%).

France: easy to compute as we use the validator integrated in the national geocatalogue, but very difficult to understand, as one can assign a dataset to many themes. Furthermore, many producers don't understand to which theme the dataset belongs

Sweden: The same comments as for the question above [As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 %]

Anonymous: not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense

MDi1,1/2/3 - d) Should the indicators MDi1,1/2/3 be included in the dash board?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%


MDi1,4 - a) How is the MDi1,4 indicator (Existence of Metadata for spatial data services) collected?

a) automated collection through metadata information 3 20%

b) through other automated process 4 27%

c) manually 8 53%

MDi1,4 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EU monitoring spreadsheet. This provides the indicator.

Anonymous: see above [not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense]

MDi1,4 - c) Please provide comments regarding the indicator MDi1,4

Spain: We don't check it

Cyprus: Useful. Rather hard to collect.

Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%).

Finland: Information on non-existing metadata can't be collected automatically.

Sweden: The same comments as for the question above [As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 %]

Anonymous: see above [not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense]


MDi1,4 - d) Should the indicators MDi1,4 be included in the dash board?

a) yes 10 71%

b) no 3 21%

c) Yes under certain conditions 1 7%

Other 0 0%

MDi2 - a) How is the MDi2 indicator (Conformity of metadata) collected?

a) automated collection through metadata information 3 21%

b) through other automated process 3 21%

c) manually 8 57%

MDi2 - b) If "b" in the previous question, could you describe the process?

Poland: The information is taken from our National Geoportal - validated files

Finland: Metadata validity is tested one by one with EU commission validator.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the excel spreadsheet manually.


MDi2 - c) Please provide comments regarding the indicator MDi2

Spain: We don't check it

Cyprus: Useful. Rather hard to collect.

France: easy to compute as we use the validator integrated in the national geocatalogue

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: Metadata is validated before being published in the national geoportal, which means that the published metadata always follows the INSPIRE requirements

UK: Metadata is only harvested to our SDI if it passes strict validation. Only records within our SDI are reported in the monitoring report, so the answer to this indicator is always 1.
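
Several answers in this block prefer a single EU-level validation tool and note that locally declared conformity is not the same as validated conformity. The sketch below is only a naive local pre-check of that kind, assuming a small, illustrative subset of required metadata elements; it is in no way a substitute for the common validator the respondents ask for.

import xml.etree.ElementTree as ET

NS = {"gmd": "http://www.isotc211.org/2005/gmd",
      "gco": "http://www.isotc211.org/2005/gco"}

# Small illustrative subset of elements the INSPIRE metadata rules require;
# the real conformance test covers far more than this.
REQUIRED = {
    "resource title": ".//gmd:CI_Citation/gmd:title/gco:CharacterString",
    "abstract": ".//gmd:abstract/gco:CharacterString",
    "keyword": ".//gmd:keyword/gco:CharacterString",
    "metadata point of contact": ".//gmd:contact",
}

def missing_elements(record_xml):
    # Return the names of required elements that are absent from one record
    root = ET.fromstring(record_xml)
    return [name for name, xpath in REQUIRED.items()
            if root.find(xpath, NS) is None]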

MDi2 - d) Should the indicator MDi2 be included in the dash board?

a) yes 10 67%

b) no 3 20%

c) Yes under certain conditions 2 13%

Other 0 0%

MDi2,1/2/3 - a) How are the MDi2,1/2/3 indicators (Conformity of metadata for annexes I,II and III) collected?

a) automated collection through metadata information 4 29%

b) through other automated process 2 14%

c) manually 8 57%


MDi2,1/2/3 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the excel spreadsheet manually.

MDi2,1/2/3 - c) Please provide comments regarding the indicators MDi2,1/2/3

Spain: We don't check it

Cyprus: Useful. Time consuming collection.

France: Same issue as for MDi1,1/2/3 [easy to compute as we use the validator integrated in the national geocatalogue, but very difficult to understand, as one can assign a dataset to many themes. Furthermore, many producers don't understand to which theme the dataset belongs]

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [Metadata is validated before published in the national geoportal which means that the metadata published are always according to the INSPIRE requirements]

UK: Metadata is only harvested to our SDI if it passes strict validation. Only records within our SDI are reported in the monitoring report so the answer to this indicator is always 1.

MDi2,1/2/3 - d) Should the indicators MDi2,1/2/3 be included in the dash board?

a) yes 9 53%

b) no 4 27%

c) Yes under certain conditions 2 20%

Other 0 0%


MDi2,4 - a) How is the MDi2,4 indicator (Conformity of metadata for spatial data services) collected?

a) automated collection through metadata information 3 21%

b) through other automated process 3 21%

c) manually 8 57%

MDi2,4 - b) If "b" in the previous question, could you describe the process?

Finland: Metadata validity is tested one by one with EU commission validator.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the excel spreadsheet manually.

MDi2,4 - c) Please provide comments regarding the indicator MDi2,4

Spain: We don't check it

Cyprus: Time consuming.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [Metadata is validated before published in the national geoportal which means that the metadata published are always according to the INSPIRE requirements]

UK: Metadata is only harvested to our SDI if it passes strict validation; as only records within our SDI are reported in the monitoring report, the answer to this indicator is always 1.


MDi2,4 - d) Should the indicators MDi2,4 be included in the dash board?

a) yes 10 67%

b) no 3 20%

c) Yes under certain conditions 2 13%

Other 0 0%

DSi1 - a) How is the DSi1 indicator (Geographical coverage of spatial data sets) collected?

a) automated collection through metadata information 1 6%

b) through other automated process 2 13%

c) manually 13 81%

DSi1 - b) If "b" in the previous question, could you describe the process?

Anonymous: see above (not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense)

DSi1 - c) Please provide comments regarding the indicator DSi1

Poland: difficult in collecting

Spain: We think there should be a list of official geographical coverages, for instance the Eurostat list of surface areas

Cyprus: Useful. Time consuming.


Italy: time consuming and not relevant

Finland: This information is hard to collect. Possibly not very useful information because INSPIRE itself doesn't require collecting new data.

France: impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it.

Germany: The data providers have difficulties in understanding the indicator, especially what is meant by actual and relevant area. The required information to calculate the indicator can't be derived from metadata. This indicator is not very suitable for reporting the implementation status.

The Netherlands: it can be calculated if the administrative unit the dataset covers is added to the metadata

Sweden: An extra metadata element has been included in the national metadata catalogue. The data custodians are required to fill this information in. This makes it easy to extract this information from the catalogue. See also the comment under D-B 10 [The information for “relevant area” and “actual area” in particular. The indicator doesn’t provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all captured to 100 % of their intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress, not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). Also the indicator for usage of services needs to be considered; this indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously hold; for instance, metadata will always exist if it is extracted from the INSPIRE geoportal]

UK: This indicator is problematic for us as the only way we can collect it is to request that our data providers provide it manually - and then we add it to the monitoring return manually. In previous years this hasn't been problematic as we have had low numbers of, and experienced, data publishers. However, with the publication of a large quantity of aII data from a large number of less mature data publishers in 2013, this has become exceptionally burdensome. We are not able to provide this indicator in 2014.

Anonymous: indicator is usually 100%

DSi1 - d) Should the indicator DSi1 be included in the dash board?

a) yes 5 33%

b) no 8 53%

c) Yes under certain conditions 1 7%

Other 1 7%


DSi1,1/2/3 - a) How are the DSi1,1/2/3 indicators (Geographical coverage of spatial data sets) collected?

a) automated collection through metadata information 1 7%

b) through other automated process 1 7%

c) manually 13 87%

DSi1,1/2/3 - b) If "b" in the previous question, could you describe the process?

France: impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it.

DSi1,1/2/3 - c) Please provide comments regarding the indicator DSi1

Spain: We don't check it

Cyprus: Useful. Time consuming.

Italy: time consuming and not relevant

Belgium: The value "Geographical coverage of spatial data sets" is useful and interesting for an individual dataset. You can see if a dataset is complete, or under construction and you can see the progress over the years. The surplus value of the indicator however, is not clear. What do we learn from this indicator? What is the purpose?

Finland: This information is hard to collect. Possibly not very useful information because INSPIRE itself doesn't require collecting new data.

France: See above [impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it]

Germany: The data providers have difficulties in understanding the indicator, especially what is meant by actual and relevant area. The required information to calculate the indicator can't be derived from metadata. This indicator is not very suitable for reporting the implementation status.

Sweden: The same comments as for the question above [An extra metadata element has been included in the national metadata catalogue. The data custodians are required to fill this information in. This makes it easy to extract this information from the catalogue. See also the comment under D-B 10 (The information for “relevant area” and “actual area” in particular. The indicator doesn’t provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all captured to 100 % of their intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress, not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). Also the indicator for usage of services needs to be considered; this indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously hold; for instance, metadata will always exist if it is extracted from the INSPIRE geoportal)]

UK: This indicator is problematic for us as the only way we can collect it is to request that our data providers provide it manually - and then we add it to the monitoring return manually. In previous years this hasn't been problematic as we have had low numbers of, and experienced, data publishers. However, with the publication of a large quantity of aII data from a large number of less mature data publishers in 2013, this has become exceptionally burdensome. We are not able to provide this indicator in 2014.

DSi1,1/2/3 - d) Should the indicators DSi1,1/2/3 be included in the dash board?

a) yes 4 27%

b) no 8 53%

c) Yes under certain conditions 1 7%

Other 2 13%

DSi2 - a) How is the DSi2 indicator (Conformity of spatial data sets) collected?

a) automated collection through metadata information 3 19%

b) through other automated process 3 19%

c) manually 10 63%


DSi2 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

Anonymous: question is not clear: data are collected manually, indicator is computed in the EXCEL-file

DSi2 - c) Please provide comments regarding the indicator DSi2

Spain: We don't check it

Cyprus: Useful. Time consuming.

Belgium: All MS should use the same validation tools, which should be provided by the EC. It is debatable whether only Y/N should appear in the dashboard per dataset/service, or maybe a percentage (e.g. 90% conformant). Of course, this depends on the information you get back from the validation tools. Ideally, the dashboard should be connected with the validation tools, and perform a validation test on request or automatically at predefined times. Besides that, the validation tools should evidently be available ‘off line’ (meaning disconnected from the dashboard) for testing.

Finland: Conformity information is included in the metadata and is easily available there but this information can be provided without doing any validation on the actual data. The lack of validation perhaps makes this information a bit unreliable.

France: This indicator is under the producer's responsibility

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though

UK: We have not provided this indicator yet as we have not considered conformity of data. We would anticipate extracting this from discovery metadata.

Anonymous: the only really useful indicator until 2020

DSi2 - d) Should the indicator DSi2 be included in the dash board?

a) yes 13 81%

b) no 0 0%

c) Yes under certain conditions 3 19%

Other 0 0%


DSi2,1/2/3 - a) How are the DSi2,1/2/3 indicators (Conformity of spatial data sets) collected?

a) automated collection through metadata information 4 27%

b) through other automated process 2 13%

c) manually 9 60%

DSi2,1/2/3 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

DSi2,1/2/3 - c) Please provide comments regarding the indicators DSi2,1/2/3

Spain: We don't check it

Cyprus: Useful. Time consuming.

Finland: Conformity information is included in the metadata and is easily available there but this information can be provided without doing any validation on the actual data. The lack of validation perhaps makes this information a bit unreliable.

France: Idem [This indicator is under the producer's responsibility]

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though]

UK: We have not provided this indicator yet as we have not considered conformity of data. We would anticipate we would extract this from discovery metadata.


DSi2,1/2/3 - d) Should the indicators DSi2,1/2/3 be included in the dash board?

a) yes 13 81%

b) no 0 0%

c) Yes under certain conditions 3 19%

Other 0 0%

NSi1 - a) How is the NSi1 indicator (Accessibility of metadata through discovery services) collected?

a) automated collection through metadata information 4 25%

b) through other automated process 3 19%

c) manually 9 56%

NSi1 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the excel spreadsheet manually.

Anonymous: see above [the only really useful indicator until 2020]

NSi1 - c) Please provide comments regarding the indicator NSi1

Spain: We don't check it

Cyprus: Useful. Time consuming.

Belgium: This indicator will become obsolete if dashboard retrieves information via the discovery services.


Finland: This indicator is overlapping with MDi1 but both might be useful still.

France: Idem MDi1 [easy : by construction, the result is 100% as we check that every known datasets were in the FR SDI]

The Netherlands: if the monitoring is based on the content of the discovery service this indicator is not needed; if it exists in the discovery it is in the monitoring, otherwise not

Sweden: As the information is derived from the national metadata catalogue, it is automatically derived.

UK: Only records within our SDI central catalogue are reported in the monitoring report, so the answer to this indicator is always 1

Anonymous: not useful in future. you ask for the conformance to legal requirements - I am not sure if this makes sense

NSi1 - d) Should the indicator NSi1 be included in the dash board?

a) yes 8 53%

b) no 4 27%

c) Yes under certain conditions 3 20%

Other 0 0%

NSi1,1 - a) How is the NSi1,1 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data set) collected?

a) automated collection through metadata information 4 27%

b) through other automated process 2 13%

c) manually 9 60%


NSi1,1 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the excel spreadsheet manually

NSi1,1 - c) Please provide comments regarding the indicator NSi1,1

Spain: We don't check it

Cyprus: Useful. Time consuming.

Finland: Information on non-existing metadata can't be collected automatically.

France: Idem MDi1 [easy: by construction, the result is 100% as we check that every known dataset is in the FR SDI]

The Netherlands: this is always the case; the discovery service is harvested in the EU INSPIRE portal

Sweden: The same comments as for the question above [As the information is derived from the national metadata catalogue, it is automatically derived]

UK: Because all metadata included in our monitoring return is drawn from our SDI's central catalogue, our answer for this is always 1

NSi1,1 - d) Should the indicator NSi1,1 be included in the dashboard?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi1,2 - a) How is the NSi1,2 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data services) collected?

a) automated collection through metadata information 4 29%

b) through other automated process 2 14%

c) manually 8 57%

NSi1,2 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: We fill in the field in the Excel spreadsheet manually.

NSi1,2 - c) Please provide comments regarding the indicator NSi1,2

Spain: We don't check it

Cyprus: Useful.

Finland: Information on non-existing metadata can't be collected automatically.

France: no issue

The Netherlands: this is always the case; the discovery service is harvested in the EU INSPIRE portal

Sweden: The same comments as for the question above [As the information is derived from the national metadata catalogue, it is automatically derived]

UK: Because all metadata included in our monitoring return is drawn from our SDI's central catalogue, our answer for this is always 1.

NSi1,2 - d) Should the indicator NSi1,2 be included in the dashboard?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi2 - a) How is the NSi2 indicator (Accessibility of spatial data set through view and download services) collected?

a) automated collection through metadata information 1 7%

b) through other automated process 2 13%

c) manually 12 80%

NSi2 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.
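
The UK workflow above ends with manual processing of CSV reports into the monitoring spreadsheet. The sketch below shows the kind of post-processing step that could be scripted instead; the column names are assumptions for illustration, not the actual data.gov.uk report layout.

```python
# Sketch of deriving the NSi2 numerator from a catalogue CSV export: count the
# data sets reachable through both a view and a download service. Column names
# ("view_service_url", "download_service_url") are assumptions, not the real
# report layout.
import csv

def nsi2_from_report(path: str) -> tuple[int, int]:
    total = 0
    accessible = 0
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            total += 1
            has_view = bool(row.get("view_service_url", "").strip())
            has_download = bool(row.get("download_service_url", "").strip())
            if has_view and has_download:
                accessible += 1
    return accessible, total

# accessible, total = nsi2_from_report("monitoring_export.csv")
# print(f"NSi2 = {accessible / total:.2%}")
```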

NSi2 - c) Please provide comments regarding the indicator NSi2

Cyprus: Useful.

Denmark: A link or some other connection between a data set and its services (view and download) is missing and could be considered as part of the monitoring: there is currently no connection between a data set and its services; in other words, it is not possible to see where a data set is available and viewable. In some cases the service provider names the service so that it is recognizable which data set the service provides, but as the monitoring stands now only the theme and annex are indicated.

Belgium: Guidelines for MD will be necessary to enable automatic retrieval of this information via the MD (how to complete the 'online resources' fields?). The catalogue in the Flemish Geoportal http://www.geopunt.be/catalogus retrieves this information automatically from the MD. When you select a dataset in the catalogue, the buttons ‘bekijk op kaart’ (= view) and ‘download’ are directly connected with the MD of that dataset. When the online resources are filled in ‘correctly’ in the MD record, the buttons in the Geopunt catalogue are activated automatically.

Finland: If service metadata doesn't exist or is incomplete, this information has to be collected manually.

France: How do we find simple download services (Atom or http/GET)? We check the URL syntax (in order to find most of the download services offered via Atom or another mode), but we obviously miss some.

Sweden: This is not done at the moment but can be deduced from the “Coupled resource” metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there.
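
Denmark asks for a link between a data set and its services, and Sweden points to the “Coupled resource” metadata element. The sketch below illustrates, under the assumption that service metadata follows ISO 19139, how that element (srv:operatesOn) could be read to recover the data set / service linkage; the file name is a placeholder.

```python
# Sketch of deriving the data set / service link from the "Coupled resource"
# element (srv:operatesOn) in an ISO 19139 service metadata record. Real
# records may carry the reference as uuidref, xlink:href, or both, so both
# attributes are checked. The file name is a placeholder.
import xml.etree.ElementTree as ET

SRV = "http://www.isotc211.org/2005/srv"
XLINK = "http://www.w3.org/1999/xlink"

def coupled_datasets(service_metadata_file: str) -> list[str]:
    root = ET.parse(service_metadata_file).getroot()
    refs = []
    for op in root.iter(f"{{{SRV}}}operatesOn"):
        ref = op.get("uuidref") or op.get(f"{{{XLINK}}}href")
        if ref:
            refs.append(ref)
    return refs

# for dataset_ref in coupled_datasets("view_service_metadata.xml"):
#     print(dataset_ref)
```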

NSi2 - d) Should the indicator NSi2 be included in the dashboard?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi2,1 - a) How is the NSi2,1 indicator (Accessibility of spatial data set through view services) collected?

a) automated collection through metadata information 3 21%

b) through other automated process 2 14%

c) manually 9 64%

NSi2,1 - b) If "b" in the previous question, could you describe the process?

Spain: With a validation tool. We examine the getCapabilities file

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.
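
Spain describes examining the GetCapabilities file with a validation tool. The sketch below is a very rough automated screening of a view service along those lines; it is not a substitute for a full conformance validator, and the service URL is a placeholder.

```python
# Very rough sketch of an automated GetCapabilities check: fetch the
# capabilities document of a view service, verify that it parses, and see
# whether it advertises INSPIRE extended capabilities. This is only a
# screening step, not a conformance test; the service URL is a placeholder.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

def rough_view_service_check(base_url: str) -> dict:
    query = urlencode({"service": "WMS", "request": "GetCapabilities"})
    with urlopen(base_url + "?" + query) as resp:
        root = ET.fromstring(resp.read())
    has_extended = any("ExtendedCapabilities" in elem.tag for elem in root.iter())
    return {
        "root_element": root.tag,
        "inspire_extended_capabilities": has_extended,
        "layer_count": sum(1 for e in root.iter() if e.tag.endswith("Layer")),
    }

# print(rough_view_service_check("https://example.org/inspire/wms"))
```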

NSi2,1 - c) Please provide comments regarding the indicator NSi2,1

Cyprus: Useful.

Finland: If service metadata doesn't exist or is incomplete, this information has to be collected manually.

France: No issue

Sweden: The same comments as for the question above [This is not done at the moment but can be deduced from the “Coupled resource” metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there]

NSi2,1 - d) Should the indicator NSi2,1 be included in the dashboard?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi2,2 - a) How is the NSi2,2 indicator (Accessibility of spatial data set through download services) collected?

a) automated collection through metadata information 2 14%

b) through other automated process 2 14%

c) manually 10 71%

NSi2,2 - b) If "b" in the previous question, could you describe the process?

Spain: With a validation tool. We examine the getCapabilities file

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.

NSi2,2 - c) Please provide comments regarding the indicator NSi2,2

Cyprus: Useful.

Finland: If service metadata doesn't exist or is incomplete, this information has to be collected manually.

France: How do we find simple download services (Atom or http/GET)? We check the URL syntax (in order to find most of the download services offered via Atom or another mode), but we obviously miss some.

Sweden: The same comments as for the question above [This is not done at the moment but can be deduced from the “Coupled resource” metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there]
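
France describes checking URL syntax to spot Atom and other download services. A minimal sketch of such a heuristic is given below; as France notes, it will inevitably miss some services, and the sample URLs are made up.

```python
# Sketch of a URL-syntax heuristic for spotting download service endpoints:
# classify an access URL as Atom, WFS or unknown from its shape alone. This is
# only a first pass before manual review; the sample URLs are made up.
from urllib.parse import urlparse, parse_qs

def classify_download_endpoint(url: str) -> str:
    parsed = urlparse(url)
    query = {k.lower(): [v.lower() for v in vals]
             for k, vals in parse_qs(parsed.query).items()}
    path = parsed.path.lower()
    if "wfs" in query.get("service", []) or "wfs" in path:
        return "WFS download service"
    if path.endswith((".xml", ".atom")) or "atom" in path:
        return "Atom feed (simple download service)"
    return "unknown - needs manual inspection"

# for url in ("https://example.org/atom/ortho.xml",
#             "https://example.org/ows?service=WFS&request=GetCapabilities"):
#     print(url, "->", classify_download_endpoint(url))
```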

NSi2,2 - d) Should the indicator NSi2,2 be included in the dashboard?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi3 - a) How is the NSi3 indicator (Use of all network services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3 - b) Please provide comments regarding the indicator NSi3

Poland: very hard to collect, no such mechanism available (especially with the period of one year)

Cyprus: Useful.

Italy: Comparability issues between data providers

Greece: There might be a need to better define "use". Is it the number of unique requests, the number of unique visitors or the number of other applications that use the service? I am not sure that the service providers are sure about the number they (manually) provide. Monitoring the actual uses of a service by other applications and portals might be more useful for documenting the benefits and added value of an INSPIRE service. I think that the benefits and added value are the actual meaning of that indicator.

Finland: Useful information for following service use statistics, but hard to collect because it has to be gathered manually from service providers.

France: No centralized information and too many public authorities; many servers have no statistics system; the cost of getting this information would not be reasonable

Germany: Most of the data providers can't provide this information, because it's not measured. So we assume "0" requests in such cases. Thus the value of the indicator is not reliable and says nothing about the use of the services or about the implementation status.

The Netherlands: is an alternative possible, e.g. the use via the EU portal?

Sweden: A “receiving point” to which data providers can send log-files is under development and is expected to be functional this spring.

UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain this information. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

Anonymous: number of service requests should be better defined (number of layers per service, ...)
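
Sweden mentions a “receiving point” for log files, and several respondents ask what should count as one “use”. The sketch below shows the kind of aggregation such a receiving point could run over web-server access logs; the log format, file name and the chosen operations are assumptions for illustration.

```python
# Sketch of turning a web-server access log into NSi3-style figures by
# counting requests per OGC operation. The log format and file name are
# assumptions; a real setup would also need an agreed definition of "use"
# (see the Spain and Greece comments above).
import re
from collections import Counter

REQUEST_RE = re.compile(
    r"request=(getcapabilities|getmap|getfeature|getrecords)",
    re.IGNORECASE,
)

def count_service_requests(access_log_path: str) -> Counter:
    counts = Counter()
    with open(access_log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = REQUEST_RE.search(line)
            if match:
                counts[match.group(1).lower()] += 1
    return counts

# print(count_service_requests("view_service_access.log"))
```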

NSi3 - c) Should the indicator NSi3 be included in the dashboard?

a) yes 6 38%

b) no 4 25%

c) Yes under certain conditions 5 31%

Other 1 6%

NSi3,1 - a) How is the NSi3,1 indicator (Use of discovery services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,1 - b) Please provide comments regarding the indicator NSi3,1

Spain: It is necessary to define the term "use": for instance, is it the number of visits or the number of GetRecords requests, ...

Cyprus: Useful.

Italy: Comparability issues between data providers

Finland: Useful information for following service use statistics, but hard to collect because it has to be gathered manually from service providers.

France: No centralized information and too many public authorities; many servers have no statistics system; the cost of getting this information would not be reasonable

Germany: Most of the data providers can't provide this information, because it's not measured. So we assume "0" requests in such cases. Thus the value of the indicator is not reliable and says nothing about the use of the services or about the implementation status.

The Netherlands: is an alternative possible, e.g. the use via the EU portal?

Sweden: The same comments as for the question above [A “receiving point” to which data providers can send log-files is under development and is expected to be functional this spring]

NSi3,1 - c) Should the indicator NSi3,1 be included in the dashboard?

a) yes 5 36%

b) no 5 36%

c) Yes under certain conditions 4 29%

Other 0 0%

NSi3,2 - a) How is the NSi3,2 indicator (Use of view services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,2 - b) Please provide comments regarding the indicator NSi3,2

Spain: It is necessary to define the term "use": for instance, is it the number of visits or the number of GetMap requests, ...

Cyprus: Useful.

Italy: Comparability issues between data providers

Finland: Useful information for following service use statistics, but hard to collect because it has to be gathered manually from service providers.

France: No centralized information and too many public authorities; many servers have no statistics system; the cost of getting this information would not be reasonable

Germany: Most of the data providers can't provide this information, because it's not measured. So we assume "0" requests in such cases. Thus the value of the indicator is not reliable and says nothing about the use of the services or about the implementation status.

The Netherlands: is an alternative possible, e.g. the use via the EU portal?

Sweden: The same comments as for the question above [A “receiving point” to which data providers can send log-files is under development and is expected to be functional this spring]

UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain this information. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

NSi3,2 - c) Should the indicator NSi3,2 be included in the dashboard?

a) yes 5 33%

b) no 5 33%

c) Yes under certain conditions 4 27%

Other 1 7%

NSi3,3 - a) How is the NSi3,3 indicator (Use of download services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,3 - b) Please provide comments regarding the indicator NSi3,3

Spain: It is necessary to define the term "use": for instance, is it the number of visits or the number of GetFeature requests, ...

Cyprus: Useful.

Italy: Comparability issues between data providers

Finland: Useful information for following service use statistics, but hard to collect because it has to be gathered manually from service providers.

France: No centralized information and too many public authorities; many servers have no statistics system; the cost of getting this information would not be reasonable

Germany: Most of the data providers can't provide this information, because it's not measured. So we assume "0" requests in such cases. Thus the value of the indicator is not reliable and says nothing about the use of the services or about the implementation status.

The Netherlands: is an alternative possible, e.g. the use via the EU portal?

Sweden: The same comments as for the question above [A “receiving point” to which data providers can send log-files is under development and is expected to be functional this spring]

UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain this information. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

NSi3,3 - c) Should the indicator NSi3,3 be included in the dashboard?

a) yes 5 33%

b) no 5 33%

c) Yes under certain conditions 4 27%

Other 1 7%

NSi3,4 - a) How is the NSi3,4 indicator (Use of transformation services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,4 - b) Please provide comments regarding the indicator NSi3,4

Cyprus: Useful.

Italy: Comparability issues between data providers

Finland: Useful information for following service use statistics, but hard to collect because it has to be gathered manually from service providers.

France: No centralized information and too many public authorities; many servers have no statistics system; the cost of getting this information would not be reasonable

Germany: Most of the data providers can't provide this information, because it's not measured. So we assume "0" requests in such cases. Thus the value of the indicator is not reliable and says nothing about the use of the services or about the implementation status.

The Netherlands: is an alternative possible, e.g. the use via the EU portal?

Sweden: Same as above, although so far there are no transformation services listed. What would be the measure – number of transformations done?

UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain this information. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

NSi3,4 - c) Should the indicator NSi3,4 be included in the dashboard?

a) yes 4 27%

b) no 6 40%

c) Yes under certain conditions 3 20%

Other 2 13%

NSi3,5 - a) How is the NSi3,5 indicator (Use of invoke services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,5 - c) Should the indicator NSi3,5 be included in the dashboard?

a) yes 5 33%

b) no 6 40%

c) Yes under certain conditions 3 20%

Other 1 7%

NSi4 - a) How is the NSi4 indicator (Conformity of all services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4 - b) Please provide comments regarding the indicator NSi4

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Belgium: Please see comments earlier on common validation tools [monitoring information including validation results (Y/N or even more detailed?) IF all MS use the same validation tools which should be provided by the EC]

Finland: The EU Commission validator can't validate services that require authentication.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though

UK: We'd anticipate that this information could be derived from the discovery metadata, although quality issues remain.
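
The UK anticipates that conformity could be derived from discovery metadata. The sketch below illustrates that idea for an ISO 19139 record, treating a missing pass value as “not evaluated” in line with the workaround Sweden mentions; the file name is a placeholder.

```python
# Sketch of deriving a conformity indicator from discovery metadata: read the
# DQ_ConformanceResult elements of an ISO 19139 record and report the declared
# degree of conformity. A missing gmd:pass value is treated as "not evaluated";
# the file name is a placeholder.
import xml.etree.ElementTree as ET

GMD = "http://www.isotc211.org/2005/gmd"
GCO = "http://www.isotc211.org/2005/gco"

def declared_conformity(metadata_file: str) -> list[tuple[str, str]]:
    root = ET.parse(metadata_file).getroot()
    results = []
    for conf in root.iter(f"{{{GMD}}}DQ_ConformanceResult"):
        title_el = conf.find(f".//{{{GMD}}}title/{{{GCO}}}CharacterString")
        spec = (title_el.text.strip()
                if title_el is not None and title_el.text else "unknown specification")
        pass_el = conf.find(f"{{{GMD}}}pass/{{{GCO}}}Boolean")
        if pass_el is None or pass_el.text is None:
            degree = "not evaluated"
        elif pass_el.text.strip().lower() == "true":
            degree = "conformant"
        else:
            degree = "not conformant"
        results.append((spec, degree))
    return results

# for spec, degree in declared_conformity("dataset_metadata.xml"):
#     print(degree, "-", spec)
```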

NSi4 - c) Should the indicator NSi4 be included in the dashboard?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,1 - a) How is the NSi4,1 indicator (Conformity of network services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,1 - b) Please provide comments regarding the indicator NSi4,1

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Finland: The EU Commission validator can't validate services that require authentication.

France: No issue as long as we use only one validator. Problems arise if French public authorities use other validators.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though]

NSi4,1 - c) Should the indicator NSi4,1 be included in the dashboard?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,2 - a) How is the NSi4,2 indicator (Conformity of view services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,2 - b) Please provide comments regarding the indicator NSi4,2

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Finland: The EU Commission validator can't validate services that require authentication.

France: Idem [No issue as long as we use only one validator. Problems arise if French public authorities use other validators.]

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though]

NSi4,2 - c) Should the indicator NSi4,2 be included in the dashboard?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,3 - a) How is the NSi4,3 indicator (Conformity of download services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,3 - b) Please provide comments regarding the indicator NSi4,3

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Finland: The EU Commission validator can't validate services that require authentication.

France: Idem [No issue as long as we use only one validator. Problems arise if French public authorities use other validators.]

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. Current problems are the three options according to the IR whereby the option “not evaluated” isn’t catered for by the ISO standard used for metadata. There is a workaround for this, though]

NSi4,3 - c) Should the indicator NSi4,3 be included in the dashboard?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,4 - a) How is the NSi4,4 indicator (Conformity of transformation services) collected?

a) automatically 3 21%

b) manually 11 79%

NSi4,4 - b) Please provide comments regarding the indicator NSi4,4

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Finland: The EU Commission validator can't validate transformation services.

France: In fact, we have no transformation service and I do not know if we have a validator.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: There are no services listed, but if there were, they would be collected automatically

NSi4,4 - c) Should the indicator NSi4,4 be included in the dashboard?

a) yes 11 73%

b) no 3 20%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,5 - a) How is the NSi4,5 indicator (Conformity of invoke services) collected?

a) automatically 2 14%

b) manually 12 86%

NSi4,5 - b) Please provide comments regarding the indicator NSi4,5

Spain: Automatic validation tools are necessary

Cyprus: Useful.

Finland: The EU Commission validator can't validate invoke services.

France: Idem NSi4.4, with the added difficulty that there is no IR for invoke services.

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

The Netherlands: we prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this

Sweden: Currently there are no services listed, and with the new amendment regarding SDS/Invoke this becomes tricky, as there is no yes-or-no answer to the question.

NSi4,5 - c) Should the indicator NSi4,5 be included in the dashboard?

a) yes 11 73%

b) no 2 13%

c) Yes under certain conditions 2 13%

Other 0 0%

Annex 3

INSPIRE Reporting

-------------------------

Rep 1) Should information on 'coordination and quality assurance' be included in the dashboard?

a) yes 4 25%

b) no 9 56%

Other 3 19%

Italy: Not sure if it will help

Greece: Not sure how this information might be collected. If there are indicators that can measure such information, then yes.

Slovak Republic: With guidance on what kind of information is expected

Rep 2) Should information on 'contribution to the functioning and coordination of the infrastructure' be included in the dashboard?

a) yes 5 31%

b) no 9 56%

Other 2 13%

Italy: Not sure if it will help

Greece: Not sure how this information might be collected. If there are indicators that can measure such information, then yes.

Rep 3) Should information on 'use of the infrastructure for spatial information' be included in the dashboard?

a) yes 6 38%

b) no 9 56%

Other 1 6%

Italy: Not sure if it will help

Rep 4) Should information on 'data sharing arrangements' be included in the dashboard?

a) yes 4 29%

b) no 9 64%

Other 1 7%

Italy: Not sure if it will help

Rep 5) Should information on 'cost benefit aspects' be included in the dashboard?

a) yes 6 40%

b) no 9 60%

Other 0 0%

Rep 6 - a) Do you collect any additional information during the monitoring process that is not mandated by the Decision on INSPIRE monitoring and reporting?

a) yes 6 38%

b) no 10 63%

Rep 6 - b) If "yes" on the above question, which additional information is collected?

Greece: The transposition law (No. 3882 of 2010) requires the collection of information on software, hardware, licences that relate in some way to spatial information. There has been an attempt to collect this information, but the fact that the collection was manual does not allow us today to use and update the information collected.

Belgium: For Flanders: Link between dataset and its services (via metadata).

Finland: Information on dataset and service licensing and user rights.

Germany:
- Information about the administrative level (federal government, states, municipalities, ...) of the submitting organisation
- fileIdentifier of the metadata set
- service end point
- linkage between data set and view and download service
- additional comments

Slovak Republic: URLs for metadata and network services
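
Several of the extra items mentioned above (Germany's fileIdentifier, service end points and data set / service linkage) live in the metadata records themselves. The sketch below shows how two of them could be pulled out of an ISO 19139 record automatically; the file name is a placeholder.

```python
# Sketch of extracting the metadata fileIdentifier and the online resource
# links (service end points) from an ISO 19139 record, so these extra
# monitoring fields would not need to be collected by hand. The file name is
# a placeholder.
import xml.etree.ElementTree as ET

GMD = "http://www.isotc211.org/2005/gmd"
GCO = "http://www.isotc211.org/2005/gco"

def extra_monitoring_fields(metadata_file: str) -> dict:
    root = ET.parse(metadata_file).getroot()
    ident = root.find(f"{{{GMD}}}fileIdentifier/{{{GCO}}}CharacterString")
    links = [url.text.strip()
             for url in root.iter(f"{{{GMD}}}URL")
             if url.text and url.text.strip()]
    return {
        "fileIdentifier": ident.text.strip() if ident is not None and ident.text else None,
        "online_resources": links,
    }

# print(extra_monitoring_fields("dataset_metadata.xml"))
```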

Annex 4

INSPIRE General Information

-------------------------

Inf 1) Is the status of a) INSPIRE content and b) the status of implementation presented to the eGovernment community?

Not foreseen 3 20%

Currently presented 8 53%

Foreseen 4 27%

Inf 2) About INSPIRE Monitoring information, is the communication with domain specific networks triggered?

Not foreseen 6 40%

Currently active 6 40%

Foreseen 3 20%

Inf 3) About INSPIRE Monitoring information, are there investigation initiatives in synergy with Open Data ongoing?

Not foreseen yet 5 33%

Currently ongoing 7 47%

Foreseen 3 20%

Inf 4) About INSPIRE Monitoring information, are there activities targeting the awareness raising and motivation of stakeholders in place?

Not foreseen yet 5 33%

Currently in place 8 53%

Foreseen 2 13%

Inf 5) About INSPIRE Monitoring information, is the support for governance processes in place?

Not foreseen yet 7 47%

Currently in place 5 33%

Foreseen 3 20%

Inf 6) Is feedback collected on INSPIRE coordination processes?

Not foreseen yet 5 33%

Currently collected 8 53%

Foreseen 2 13%

Inf 7) Is feedback collected on INSPIRE implementation processes?

Not foreseen yet 5 33%

Currently collected 7 47%

Foreseen 3 20%