LPISQA Legacy/2011 Workshop Amsterdam

From Wikicap - European Commission

This document contains the positions of the Commission services on the issues identified by the working groups during the 2011 LPIS workshop, held in Amsterdam on 6-8 April 2011. It comprises two parts:

  1. the individual Q&A items, in the order in which they were presented by Commission services staff during the workshop closing session;
  2. an inventory of issues and suggestions identified by the working groups, with an index to the relevant Q&A topics.

These positions apply to the 2010 LPIS QA implementations and serve as input for the discussion on the review of the 2011 LPIS QA exercise.

No substantial changes to version 4.3 annexes or corresponding guidelines have been implemented.

The Commission services may revise their positions upon further findings, experiences, results and screening of the 2010 LPIS QA implementation.

Ispra, Brussels, 15/5/2011

Contents

Q&A

PRACTICAL PROBLEMS

Topic 1.1 Imagery

Delegations commented on the difference in quality between the imagery used for LPIS creation and that used for the LPIS QAF. These "quality" issues included the "scale" of the imagery, the resolution and the timing.

The Commission services are looking into the following options:

  • Purchasing only imagery of very high quality
  • Allowing MS to buy better imagery if they see fit

In either case it is paramount that no deterioration of the LPIS quality is allowed.

Furthermore, in contrast to aerial orthoimagery, JRC found that the quality of the CwRS VHR orthoimagery depends very much on the ortho-production process and its use of ancillary data (GCPs, DEM), over which the producer often has no direct control. The image content can be seriously downgraded if an inappropriate ortho-production process or irrelevant ancillary data are used. Often too little attention is given to radiometric quality, colour balance and the preservation of image detail, as the focus lies only on geometric quality.

Unfortunately, there are to date no clear and standardized metrics for the quality check of the radiometry. To fill this void, JRC has already revised its Guidelines for Best Practice and Quality Checking of Ortho Imagery and the Orthoimage technical specifications for the purpose of LPIS. In addition, some possible metrics to assess the relative geometric accuracy, such as residual plots and visual inspection of the spatial fit between the vector and raster data, were introduced in the same documentation. These guidelines and specifications will be further revised in the light of the findings from the “screening” of the 2010 ETS results, so that specific, more stringent requirements can be proposed for the orthoimagery used in the LPIS QA. Some recommendations in that respect were already given in the Wiki article Use of Orthoimagery.

An additional article will be created in the "Support" section of the LPIS QA documentation regarding the quality of orthoimagery, where clarifications will be given on the following issues:

  • Influence of the input image parameters on the CAPI for ETS
    • viewing angle
    • time of acquisition
    • type of image product
    • radiometry (bit depth)
    • ground sampling distance
  • Manual photointerpretation of land cover
    • Operator subjectivity
    • Impact of phenological development on interpretation
    • Use of ancillary data (multi-temporal data)
    • Visual scale for digitalization
    • Others


For 2010, and of course for 2011, we strongly advise Member States to evaluate whether their orthoimagery meets the above orthoimagery specifications. Sub-optimal image processing by the contractor should be addressed immediately.

Topic 1.2 Tolerance (on small parcels)

Delegations expressed the need for tolerances for small parcels, large parcels, long parcels, etc.

A technical tolerance is an expression of measurement variability. This variability has been taken into account in the tiered thresholds (3% - 5% - 7%) and, to some extent, in the probability that underlies the LQ statistics.

The Commission services do not consider the introduction of new or additional tolerances to be good practice.

Indeed, the tolerance prevents analysis of the field situation by vindicating (“hiding”) parcels whose shape and size prevent accurate and precise measurement. Although a given LPIS might well be the most appropriate design for the prevailing conditions, it is nevertheless essential to become aware of these conditions and the implications of the choices made in the LPIS. The Commission services are looking into the following options and await supporting evidence from the screening of the ETS packages.

For QE2 this means introducing an additional scoreboard entry that doesn't take into account parcels smaller than a threshold size to be determined.
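As an illustration only, such an additional entry could be computed next to the existing one as sketched below; the field names and the 0.3 ha threshold are hypothetical, not values set by the Commission services.

```python
# Illustrative sketch only: computing the existing QE2 entry alongside an additional
# entry that ignores parcels below a (still to be determined) threshold size.
# The field names and the 0.3 ha threshold are hypothetical.
def qe2_scoreboard(parcels, min_area_ha=0.3):
    """parcels: list of dicts with 'reference_area_ha' and 'conforming' (bool)."""
    def nonconformity_rate(sample):
        return (sum(1 for p in sample if not p["conforming"]) / len(sample)) if sample else 0.0

    large = [p for p in parcels if p["reference_area_ha"] >= min_area_ha]
    return {
        "all_parcels": nonconformity_rate(parcels),
        "excluding_small_parcels": nonconformity_rate(large),
    }
```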

Topic 1.3 Timeframe for reporting

Delegations expressed the view that, by the time the report for year N is finished, the evaluation for year N+1 has already started, so not all actions taken after year N give positive results in time for year N+1.

The Commission services understand the situation. However, analyses showed that changing the timing of the work as such is not easy, given the deadlines to start and complete the work. There are no real alternatives.

As regards the concern that remedial efforts have not yet paid off, this issue is inherent to a yearly exercise. In the evaluation of the LPIS QAF, the longer time span of remedial action is "accounted for" by the Commission services.

Topic 1.4 ETS documentation received mixed feedback as regards its "quality"

Some delegations say the ETS documentation is too detailed; others that it is incomplete and/or needs to be clarified.

The Commission services will consider the issues that need "deletion" – the delegations' help would therefore be welcome, i.e. indicating what can go out and what should come in.

  • A tutorial to the setup and structure of the documentation can be found here
  • JRC will revise and prepare a plan for a new version v5.0 of the ETS documentation. The revision will focus on the leads indicated in this article.

Topic 1.5 Requests for a change to waivers or issue new waivers

Waivers can be introduced, but only if good justification is provided by the MS. Care must be taken that creating waivers does not render the evaluation pointless, as any waivered issue is no longer subject to further analysis.

Therefore each waiver introduction must include specific general and local conditions for its application.

If a member state wishes to propose new waivers to be considered for the 2011 campaign, please fill in the template and mail it. Before submitting a waiver, please ensure

  1. that the proposed waiver involves either contamination or potential critical defects;
  2. that the particular issue is not yet addressed by a modification of the measures involved;
  3. that it is duly motivated and illustrated.

Submitting a proposal does not automatically imply acceptance. Only waivers listed in Annex 1 are valid.

To ensure a better service for the delegations, i.e. to keep a balance in the guidelines provided (> topic 1.4), a forum whereby (a number of) delegations could pre-evaluate the need for a waiver can be considered.

Topic 1.6 Why is it required to redigitise? Can we not copy/paste?

See the Copenhagen workshop Q&A

Topic 1.7 Why consider parcels that are not declared for aid?

It is a legal requirement that all agricultural area on the holding shall be declared and hence, the parcels not claimed for aid must also be in the LPIS. More precisely, Article 19(1) of Regulation (EC) No 73/2009 establishes that farmers shall declare all the agricultural parcels of the holding. This implies that farmers must declare not only the parcels in respect of which they claim aid but also any other unclaimed parcels of the holding. The main purposes of this obligation to declare all parcels are to enable effective cross checks as well as the control of the cross compliance requirements.

In accordance with Article 55 of Regulation (EC) No 1122/2009, farmers might be subject to reductions in the case where they have omitted to declare one or more parcels. However, this Article does not provide a legal basis for imposing sanctions in the case where the farmer has declared all his parcels but with an underestimated area, i.e. a number of hectares which is below the size determined by the authorities.

CONCEPTUAL ISSUES

Topic 2.1 Quality Element 5 is relevant, but needs revision

As described in the rationale, one purpose of the LPIS is to support the farmer declaration process. In an ideal world, the farmer declaration should be no more than a confirmation of the reference area available, but the 2010 experience has shown that reality is not always ideal: In many cases the area declared doesn't equal the reference area available, e.g. because not all farmers on a reference parcel apply for aid.

The impact/support of the LPIS on the farmer declaration can be critical for two functional issues:

  • If the reference area is too small, the farmer cannot apply for aid on all available land.
  • If the total declared area on a parcel is unrelated to the reference area, it provides opportunities for improper declaration of land.

The first issue can be expected to solve itself, as parcels with an underestimated reference area should trigger the farmer to request a reference area update. By contrast, an explicit indicator could be considered to verify whether incompletely declared parcels of year N suspiciously become more declared in year N+1.

Topic 2.2 Quality Element 6 is relevant, but needs revision esp. regarding the link to update needs

Update is the most important challenge for any GIS, including the LPIS. Several different processes all contribute to keeping the information up to date (see LPIS update); a forced "acute refresh" (throwing away a database to replace it with a newly produced one) is by far the worst option.

The purpose of QE6 is to monitor the update processes so that an acute refresh can be avoided.

The categorization of non-conforming and defective parcels under a failed-update cause in QE3 provides an indication of the overall failure of the update processes, but gives no information on the actual update performance of any individual process. The relative abundance of transactions triggered by each of the actors (farmer/inspector/LPIS custodian/national mapping agency) does; if these transactions are managed effectively, they should be completely processed before the next claim period.

These considerations can be formulated into 3 leads for the updating process.

  1. The rate of reference parcels that farmers indicate as subject to change should not deviate by more than 25% from the real annual change rate observed during the OTSC (previous or current year).
  2. The rate of missed updates observed by the OTSC inspectors should not deviate by more than 10% from the change rate observed by the reference parcel sample inspection under QE3.
  3. 98% of the reference parcel changes detected since the start of the previous claim period should be fully processed at the start of the claim period. This rate is derived from an IACS register query of the previous year.

The first two leads imply that OTSC inspectors are able to determine the "annual change rate" and "rate of missed updates" as part of their LPIS update role. A sketch of how these leads could be checked is given below.
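The following minimal sketch assumes that the 25% and 10% deviations in leads 1 and 2 are relative to the observed rates; all variable names and figures are illustrative only and not part of the ETS.

```python
# Illustrative sketch of the three QE6 leads; the interpretation of the 25% and 10%
# deviations as relative deviations is an assumption, and all inputs are example values.
def check_qe6_leads(farmer_change_rate, otsc_change_rate,
                    missed_update_rate, qe3_change_rate,
                    processed_changes, detected_changes):
    lead1 = abs(farmer_change_rate - otsc_change_rate) <= 0.25 * otsc_change_rate
    lead2 = abs(missed_update_rate - qe3_change_rate) <= 0.10 * qe3_change_rate
    lead3 = detected_changes == 0 or processed_changes / detected_changes >= 0.98
    return {"lead1": lead1, "lead2": lead2, "lead3": lead3}

# e.g. 6% farmer-indicated change vs 5% observed by OTSC, 2.2% missed updates vs a
# 2% QE3 change rate, and 980 of 1000 detected changes processed in time
print(check_qe6_leads(0.06, 0.05, 0.022, 0.02, 980, 1000))
```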


As with the other measures, QE6 should not deterministically trigger a reaction but lead to analysis and a sound remediating plan. > topic 3.1

Topic 2.3 Quality Element 7 is relevant, but needs revision esp. regarding the link with OTSC

Member states have indicated that the LPIS is only one of the many possible causes that lead to irregular applications and that it is difficult to extract the correct irregularities from the IACS query. Finally, some member states indicated that the LPIS QA sampling should be respected in this measure.

Both LPIS QA and OTSC inspections resort to sampling procedures and a key challenge is to achieve a representative sample of reference parcels common to both inspection procedures. Obtaining sufficient common reference parcels mainly depends on the OTSC strategy:

  1. Member States applying the CwRS program probably need no action as, on European average, about one third of the agricultural area of the CwRS site is subject to CwRS inspection. The random OTSC zones should therefore provide a sufficiently large common CwRS-LPIS QA sample.
  2. Member States relying on field inspections only will need to specifically select a number of claims of their OTSC so as to cover a sufficiently large common sample. As the LPIS QA sample is by definition random, the OTSC checks on it would also be part of the random OTSC sample.

As QE7 aims to demonstrate that LPIS is NOT a key contributor to irregular claims, two leads can be proposed and applied on the common sample:

  1. Not more than 2% of the common reference parcels are claimed for an agricultural parcel which belongs to a crop group that was determined to be over-declared.
  2. The rate of irregular claims of farmers declaring on non-conforming or defective reference parcels should not be significantly different from the overall OTSC rate of claims with irregularities (of the common sample); a minimal sketch of such a comparison follows this list.
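The second lead calls for a comparison of two observed rates; one conventional way to check whether they differ significantly is a two-proportion z-test. The sketch below only illustrates that generic test with invented counts; it is not the prescribed QE7 procedure.

```python
from math import sqrt
from scipy.stats import norm

# Illustrative two-proportion z-test: irregularity rate on non-conforming/defective
# parcels vs the overall OTSC irregularity rate of the common sample (invented counts).
def rates_differ(x1, n1, x2, n2, alpha=0.05):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * norm.sf(abs(z))          # two-sided p-value
    return z, p_value, p_value < alpha

# e.g. 9 irregular claims on 120 non-conforming/defective parcels vs 25 on 600 parcels
print(rates_differ(9, 120, 25, 600))
```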

Topic 2.4 The 5-meter buffer needs more clarification

The 5 meter concept was introduced to compensate the removal (from trial ETS v1.0) of measure 10101 for absolute positional accuracy after the feasibility trial. See feasibility report “ 18.2.5. The check for positional accuracy (quality measure 10101 –not required for 2010R146-) of the border has been found to be complex and time consuming, without a very clear purpose and use...”.

The 5 meter buffer around the RP boundary accommodates a coordinate shift in any direction (i.e. deviation in absolute coordinates) when identifying the LUI. Areas measured by the CAPI delineation are not affected by such a shift; they are, however, very much affected by the relative coordinate accuracy of the imagery used for inspection (i.e. consistency of scale throughout the image). The ETS contains no measure to quantify the relative coordinate accuracy. > topic 1.1
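A minimal geometric sketch of this idea, assuming the buffer is simply applied to the reference parcel polygon; the coordinates and matching rule are illustrative, not the prescribed ETS procedure.

```python
from shapely.geometry import Polygon

# Illustrative only: a 5 m buffer around the reference parcel absorbs a rigid coordinate
# shift when deciding whether a land unit seen on the image corresponds to the RP.
rp = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])         # reference parcel (metres)
candidate = Polygon([(3, 2), (103, 2), (103, 62), (3, 62)])   # same land, shifted ~3.6 m

search_zone = rp.buffer(5)                 # accommodates shifts of up to 5 m in any direction
same_lui = candidate.within(search_zone)   # True here: the shift stays within the buffer

# The measured area is unaffected by such a rigid shift (candidate.area == rp.area),
# but it would be affected by relative (internal) distortions of the orthoimage.
print(same_lui, candidate.area == rp.area)
```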

Topic 2.6 The need to go back to last year – create a waiver (QE2)

There must be a starting point –the reference data– and at the beginning of the declaration the situation indeed refers to the previous year; the RP data may well be corrected (art 12.4) by the farmer during the process. The Commission services acknowledge this situation and its adverse effects on the scores:

  • A waiver is however not the appropriate instrument to deal with this farmer update, as a waiver relates to a particular measure. The update affects most, if not all, measures and should therefore be accounted for in the methodology.
  • The Commission services therefore propose an additional step in the methodology, in particular in the data preparation, to update the reference area from the pre-printed form with the new area provided by the farmer before making the application (not the result of OTSC inspections), provided the Member State demonstrates that the rate of farmer updates in the LPIS QA zones is comparable with the national average.

COMMUNICATION ISSUES

Topic 3.1 LPIS QA BASIS

To understand the Commission services' position, it is important to keep in mind the following rationale:

  1. The ETS is developed as a common inspection procedure that outputs comparable raw observations from all MS.
  2. These raw observations, compiled in the ETS scoreboard, are a common basis for analysis by the MS.
  3. The thresholds applied on this ETS scoreboard act only as a trigger for further analysis (below the threshold no explanation is required).
  4. This further analysis could and should isolate and clarify “raw issues” that are not a problem for the conditions in the Member State. This can possibly lead to the compilation of an alternative ETS scoreboard.
  5. A remedial action plan should be based on the results after analysis, not on the raw ETS scoreboard.

The objective of the EC is enabling the MS to produce a good assessment report and remedial plan. For this it is essential that the guidelines are followed, in particular as regards the interpretation of the objects and the application of waivers.

The Commission services consider thresholds, waivers and tolerances as methodological instruments that vindicate issues well before they enter the raw ETS scoreboard. As these instruments prevent “reporting noise“ they are very useful, but too much “filtering” will prevent the analysis of any true signal that lies hidden in the raw observations.

Topic 3.2 ”If thresholds are not met this is not necessarily a problem, but (through the scoreboard) externally communicated as one”

Delegations expressed the view that it would be better to do away with all thresholds as they create problems of "non-compliance". Thresholds are important and should be kept, as they give the Administrations of the EU Member States the opportunity to decide whether an action needs to be taken. To evaluate a system and to see if actions are required, benchmarks are needed. > topic 3.1

Topic 3.3 ”It is not because thresholds are not met that there is a risk for the Fund”

The purpose of the LPIS QAF is not immediately to determine risk for the Fund. The purpose of the LPIS is firstly to provide correct information to the farmers as regards what can be claimed, i.e. to enable administration of the claim (iAcs). In this way it is a system designed firstly to avoid problems; only afterwards comes the control (iaCs). The LPIS QAF is there to see if measures are required to ensure that the LPIS fulfils this role.

ELIGIBILITY ISSUES

Topic 4.1 Landscape Features

To be eligible, a landscape feature should always be inside or directly bordering some "traditional" agricultural land:

  • those that are traditionally part of good practice (Article 34(2)) have a maximum width and can be, at the discretion of the member state, considered eligible
  • those that are protected by a national GAEC for retention (Article 34(3)) are defined by the member state, are not subject to size restrictions and are by default eligible
Therefore the latter features have to be taken into account in the reference area of all LPIS reference parcels and their retention must be monitored.

It is not feasible to define generally applicable technical guidelines on how to account for these landscape features in the LPIS, as both approaches on landscape features depend on respectively the regional traditions or on the national GAEC measures rather than on pan-European concepts. Each MS will have to develop a solution where it can demonstrate that the farmer is informed of the presence and eligible area of a landscape feature and, when appropriate, the inspector is able to control its retention.

The existing technical guidance on landscape features relates to the work of the inspector, and how his findings (position and area) are brought into the LPIS for the inspected reference parcels.

OVERVIEW ON THE RELEVANT RULES

When measuring the areas eligible for payment, ineligible parts of the area concerned shall be deducted. However, Member States may consider certain landscape features (for example hedges, ditches, walls) where those are traditionally part of good agricultural cropping or utilisation practices, as part of the eligible area, i.e. they do not have to be deducted. This is under the condition that they do not exceed a total width to be determined by the Member State (Regulation (EC) No 1122/2009, Article 34(2)). That width must correspond to a traditional width in the region in question and shall not exceed 2 metres.

Furthermore, Member States may recognise landscape features as being part of the GAEC obligations under cross compliance. In such a case the features in question do not have to be deducted from the eligible area in a parcel, i.e. the feature becomes eligible for payment (Regulation (EC) No 1122/2009, Article 34(3)).

Besides, the current EU rules foresee certain flexibility. An agricultural parcel that contains trees shall be considered as eligible area provided that it does not hinder the carrying out of agricultural activities (Regulation (EC) No 1122/2009, Article 34(4)). The "Guidelines for area measurement" (European Commission, Joint Research Centre, Guidelines on Article 34 of Regulation 1122/2009, Point 1.2 - http://mars.jrc.it/mars/Bulletins-Publications) point out that an agricultural parcel containing trees with a density of more than 50 trees per hectare should, as a general rule, be considered as ineligible. The Guidelines also foresee that in order to assess the eligible area within an agricultural parcel of (permanent) pasture, Member States can use a reduction coefficient in the form of a pro rata system or a percentage reduction.

Moreover, according to the Guidelines (European Commission, Joint Research Centre, Guidelines on Article 34 of Regulation 1122/2009, Point 2.6.2 - http://mars.jrc.it/mars/Bulletins-Publications), ineligible landscape features smaller than 100 m2 have to be deducted from the eligible area only if the total of these landscape features presents a significant area of the parcel in question, that is, when the total of all these small ineligible landscape features within the parcel exceeds the tolerance of the parcel, calculated as the buffer width of the measurement tool (maximum 1.5 metres) multiplied by the external perimeter of the agricultural parcel (Regulation (EC) No 1122/2009, Article 34(1)). Above this technical tolerance all ineligible landscape features in the parcel have to be deducted from the eligible area.
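A small numeric illustration of this rule, with invented figures only:

```python
# Invented figures illustrating the Article 34(1)/Guidelines rule described above.
perimeter_m = 800                      # external perimeter of the agricultural parcel
buffer_width_m = 1.5                   # buffer width of the measurement tool (max 1.5 m)
tolerance_m2 = perimeter_m * buffer_width_m     # 1200 m2

small_ineligible_features_m2 = [60, 90, 40]     # each feature is below 100 m2
total_m2 = sum(small_ineligible_features_m2)    # 190 m2

# Below the tolerance the small features do not have to be deducted; once the total
# exceeds the tolerance, all ineligible landscape features in the parcel are deducted.
deduct_all_features = total_m2 > tolerance_m2
print(tolerance_m2, total_m2, deduct_all_features)   # 1200.0 190 False
```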

In addition, EU legislation contains certain provisions which ease the treatment of minor over-declarations discovered during the checks. In case the difference between the area declared by the farmer and the area determined by the controls is at most 0.1 hectare per application, the aid to be paid to the farmer is not reduced and the farmer is paid for the area declared (Regulation (EC) No 1122/2009, Article 57(3), second indent).

Topic 4.2 Eligibility on marginal areas

In the exercise of assessing the quality of the LPIS, it is appropriate to use the approach towards eligibility of marginal areas which is used by the authorities when establishing and updating the LPIS. The approach should be set within the legal framework for eligibility of areas which is given in Reg. 73/2009, Reg. 1120/2009 and Reg. 1122/2009.

The subject of eligibility was exhaustively discussed in the Management Committee for Direct Payments in 2009/2010. In that context document DS/2009/29 was presented in the meeting of the Management Committee for Direct Payments on 26 November 2009. Annex II of this document lists the legal provisions relating to eligibility. Furthermore, some clarifications relating to the issue of eligibility, and in particular marginal areas, can be found in DS/2010/04 rev 1, which was discussed in the meeting of the Management Committee for Direct Payments on 31 March 2010. No changes to the legislation are foreseen for now.

Topic 4.3 GAEC eligibility is SPS related

GAEC becomes an eligibility condition for areas under SPS in the case where no other agricultural activity besides the GAEC maintenance is taking place on the parcel. Forest is not eligible for payments under the first pillar except where it is covered by Article 34(2) of Regulation (EC) No 73/2009 (afforestation). See the previous point for further reference to the eligibility rules for the SPS and the SAPS.

ADDITIONAL REACTIONS

The following topics were not mentioned during the first reaction at the closing session, but offer a response to additional issues identified during the presentations and working group findings.

Topic 5.1 reduction coefficient for the pro rata land cover classes

Several member states have indicated that, for marginal land, a reduction coefficient is used that is determined on a parcel-by-parcel basis. They experienced difficulty recording the results, as the LPIS QA eligibility profile imposes one coefficient or fixed rate for each type of land.

The reasoning for this pro-rata approach with a fixed coefficient is documented in a Bergamo presentation.

If the resulting area differences observed during the LPIS QA for the individual pro-rata parcels cause non-conformities that, in the view of the member state, do not necessarily indicate a true problem for the LPIS as a whole, a separate analysis of this set of pro-rata parcels is in order to demonstrate that no bias is present in the total area of agricultural land stored in the system.
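A minimal sketch of such a bias check, with hypothetical parcels and coefficients; the 0.75 profile coefficient is an example value, not a prescribed one.

```python
# Hypothetical data: comparing the total eligible area of the pro-rata parcels under the
# single coefficient of the eligibility profile with the total obtained from the
# per-parcel coefficients applied nationally, to see whether an overall bias remains.
parcels = [
    # (gross area in ha, per-parcel reduction coefficient)
    (4.0, 0.70), (2.5, 0.85), (6.2, 0.60), (1.8, 0.90),
]
profile_coefficient = 0.75   # single fixed coefficient of the eligibility profile

total_fixed = sum(area * profile_coefficient for area, _ in parcels)
total_per_parcel = sum(area * coefficient for area, coefficient in parcels)
bias_ha = total_fixed - total_per_parcel

print(round(total_fixed, 3), round(total_per_parcel, 3), round(bias_ha, 3))
# 10.875 10.265 0.61 -> a non-negligible difference would call for further analysis
```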

Topic 5.2 Rationale behind the thresholds (or better the quality expectations)

EXPECTATIONS

  • QE1 (total area):
2%: threshold for serious error in the Court of Auditors DAS methodology - update: the document has been updated and defines this as "materiality threshold"
  • QE2 (rate of area-based non-conforming parcels):
3%: this threshold on the area difference is specified twice in Comm Reg 2009R1122; both cases relate to a comparison between an area observed and an area declared.
  • in Article 58: Reductions and exclusions in cases of over-declaration: the area declared for the purposes of any area-related aid schemes, ..., exceeds the area determined ... if that difference is more than either 3 %...
  • in Article 55: Non-declaration of all areas: the difference between the overall area declared in the single application ... and the area declared plus the overall area of the parcels not declared, ..., is more than 3 % of the area declared.
In a good LPIS the area declared should be derived from the LPIS reference area.
5% and 7% thresholds include a degree of technical tolerance for smaller parcels as an add-on to the 3%
1 ha: maximum tolerance of the OTSC methodology
  • QE3 (causes of non-conformities and defects) and QE5 (area declaration rate):
5%: Arbitrary: serves an indicator/alert function.
  • QE4 (rate of defects)
LQ2: an LPIS should have no true critical defects at all. The limiting quality (in percent nonconforming parcels) is set to 2, as in the threshold for serious error in the Court of Auditors DAS methodology - update: the document has been updated and defines this as "materiality threshold"
  • QE6 (accumulated change rate):
25%: Arbitrary: serves an indicator function.
  • QE7 (rate of irregular applications):
  • 2%: threshold for serious error in the Court of Auditors DAS methodology - update: the document has been updated and defines this as "materiality threshold"
  • not significantly higher than the previous year: based on good quality management principles

LQ INDEXES

  • Regarding the translation of the verbal expression of the expectation into Limiting Quality indices (used to determine the acceptance number for attribute sampling):
[Image: LQ12pt5.PNG]
  • QE4 - for 1% / LQ2 : please look at slides 13-14 of this presentation
  • QE2 - QE3 - QE5 for 5% / LQ8 : The Commission services suggest moving from LQ8 to LQ12.5. Although this LQ substitution somewhat mimics the choice recommended in 3.5.1 of ISO 2859-2 on the AQL/LQ ratio, we must nevertheless stress that the 5% expectation was not expressed as an AQL value from the start.
The resulting acceptance numbers are indicated in the image above.
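For readers without the ISO 2859-2 tables at hand, the sketch below shows one common way to derive an acceptance number for a given sample size and limiting quality: take the largest acceptance number for which the probability of accepting a lot at the LQ level stays below the consumer's risk. The 10% consumer's risk and the binomial model are assumptions made for this illustration, not values taken from the ETS documentation.

```python
from scipy.stats import binom

# Illustrative derivation of an acceptance number for attribute sampling at a given
# limiting quality (LQ). Assumptions: binomial model, 10% consumer's risk.
def acceptance_number(sample_size, lq_percent, consumer_risk=0.10):
    p = lq_percent / 100.0
    c = -1
    # largest c such that P(accept | true nonconforming proportion = LQ) <= consumer_risk
    while binom.cdf(c + 1, sample_size, p) <= consumer_risk:
        c += 1
    return max(c, 0)   # returns 0 if even c = 0 does not meet the risk

# e.g. a sample of 500 parcels evaluated against LQ 12.5 and against LQ 8
print(acceptance_number(500, 12.5), acceptance_number(500, 8))
```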

Topic 5.3 subparcels / superparcels / hybrids

Several member states have indicated it is not easy to assign a single reference parcel prototype to their implementation.

It can be demonstrated (see slide 24+ of this presentation) that subparcels and superparcels (= aggregated parcels) can easily deviate from the optimal representation of the land for the purpose of the CAP processes. This optimal situation is not always achievable for all processes, but it is worth checking whether the particular design does not affect the LPIS QA results:

  • subparceling creates smaller parcels affecting QE2 results
  • superparcels make declarations more “fuzzy”, affecting QE5 results
  • superparcels and mismatching third party boundaries cause potential critical defects affecting QE4 results

On a more general note, these “object referencing” cardinality (multiplicity) considerations are also relevant to how landscape features are implemented in the reference parcels if they are located “on the immediate border of the agricultural parcel”, in particular for landscape features common to two neighbouring agricultural parcels:

  • a separate identification of the feature as a reference parcel will lead to declaration of the same land by two farmers, inducing a risk of a double declaration of that area.
  • inclusion of half the feature to each of the bordering parcels will lead to invisible and thus arbitrary adjudication of the land.

Topic 5.4 validation of the methodology

Prof. Arnold Bregt called for

  1. a third party/peer evaluation of the evaluation procedures
  2. a systematic meta-evaluation of the assessment approach

The Commission services propose to do this evaluation first inside the LPIS community, as that is where the true validators can be found. Independent experts who are not familiar with the CAP domain will find it difficult to assess the LPIS QA without prior knowledge of the CAPI implementation. The JRC is obviously not independent on these matters but is willing to facilitate and support these evaluation activities.

Topic 5.5 ATS developments

The LPIS Core Model (LCM) is a logical model translating the CAP legislation into geoscientific terms. By contrast, all MS LPIS databases are physical implementations of the CAP legislation. The ATS focus is on testing whether a physical implementation conforms to the LCM logical model, so that the ETS can be correctly performed (correct scope, correct values).

Ongoing modifications for the 2011 LPIS QA will focus on the interaction between ATS and ETS:

  • Revision of the ATS modules and provision of a new template to accommodate that some of the original ATS modules were transferred to separate specialized documents (A_12 > Eligibility Profile) whereas others were inactivated as they do not address issues within the 1st pillar scope (A_132 = cross-compliance).
  • A_11 reference parcel definition: clarification of procedure and requirements.
  • A_12 eligible land cover types and landscape features: better fit with the eligibility profile
  • A_13 reference parcel attributes: accommodate ETS concepts
  • A_131 mandatory attributes: clarified and tuned definitions
  • A_132 cross-compliance attributes: temporarily suspended
  • A_133 other attributes: clarification
  • Introduction of features to address:
  • sub-parcel/super-parcel,
  • attributes specific to all declared land,
  • reference parcel polygons resulting from (on-the-fly) geospatial operations,
  • attribute values derived from multiple attributes or calculated on-the-fly
  • Revision of the ATS terminology to better align it with the ETS inspection methodology.
  • Restructuring of the ATS reporting package items to better separate the ATS-conformance testing (> ATS-Log) and the ETS-support documents that are produced during the testing but with annual relevance (> waivers, eligibility profile, ICS).
  • No major changes are foreseen to the LCM in short term.

Topic 5.6 ETS developments

From the above discussions, the Commission is considering the following changes to be implemented in ETS v5.0. Please note this summary is provisional; the final changes depend on insights gained from screening the 2010 observations and the actual implementation of the measures:

  • general methodology changes
  • enabling the update of the reference area by farmer (only) during the application period
  • as member states should by now have implemented measures to deal with the issues of subpopulation, the distinction between total population and subpopulation is removed ( > topic 5.7)
  • revise documentation
  • more clarifications on orthoimage specifications and quality expectations
  • XML/GML schema set needs harmonisation due to ETS developments, enabling interoperable and smooth data exchange
  • QE1 : no changes to the measure
  • QE2
  • separate reporting of parcels larger than a threshold size
  • LQ8 set to LQ12.5
  • QE3: LQ8 set to LQ12.5
  • QE4
  • remove "discontinuity"
  • revise set of waivers based on experience
  • QE5: new indicator / measure focused on monitoring declaration changes.
  • QE6: new indicator / measure focused on monitoring real world change rate / remove "plan refresh" action
  • QE7: new indicator / measure focused on effect of LPIS on irregular applications

Topic 5.7 Scope versus denominator

Many member states are confused by the difference between the number of parcels entering the inspection (the scope) on the one hand, and the resulting number of successfully inspected parcels for that measure on the other hand.

Please note that:

  • the ETS 2010 considers only two collections of parcels (scopes) at the start of the activities: total population and sub-population (see our FAQ). The particular scope is separately indicated for each measure of Annex I.
  • during the inspection procedure, it can happen that not all parcels in the particular scope can be successfully inspected, e.g. because of a critical defect, or because of the absence of information in IACS for the inspection year. It is therefore absolutely normal that the resulting denominator under point 3 of the acceptance decisions ends up smaller than the original scope.

In the 2010 ETS:

  • The scope drives the inspection: depending on lot size, 500/800/1250 parcels of the sub-population scope need to be inspected (this number excludes the skipped parcels but includes the critical defects), see also our FAQ
  • The resulting denominators depend on which parcels can actually be inspected:
  • QE1: sub-population minus RPs that cannot be measured (note: not all CDs are unmeasurable!)
  • QE2: idem
  • QE3: total population
  • QE4: total population
  • QE5: sub-population minus RPs that are not declared during year N minus RPs that cannot be measured
  • QE6: (2010) all applications
  • QE7: (2010) all applications

For example, the case of our FAQ could result in the following final numbers (denominator entries are marked with *):

  sub-population: 1250
  measured (QE1 and QE2)*: 1197 (1115 by digitising, 82 by area recovery)
  declared: 1226
  declared AND measured (QE5)*: 1183
  total population (QE3 and QE4)*: 1466
  all applications (QE6 and QE7)*: 68781
  skipped: 113 (technically impossible to inspect)
  ignored: 2421 (sequential number > 1579)
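A minimal sketch of how such denominators follow from per-parcel inspection flags; the flag names are hypothetical, and QE6/QE7, which count applications rather than parcels, are left out.

```python
# Hypothetical per-parcel flags illustrating why the denominators end up smaller
# than the original scope (QE6/QE7 are application-based and not shown here).
def ets_denominators(parcels):
    """parcels: list of dicts with boolean flags 'in_subpopulation', 'measured', 'declared'."""
    sub = [p for p in parcels if p["in_subpopulation"]]
    return {
        "QE1_QE2": sum(1 for p in sub if p["measured"]),
        "QE3_QE4": len(parcels),   # total population
        "QE5":     sum(1 for p in sub if p["measured"] and p["declared"]),
    }
```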

Working group findings: key issues and suggestions

The plenary presentations made by the working group chairs can be found on the website.

QE1 (total eligible area): comments

  • Marginal areas (how to map?) > topic 4.2
  • Delineation issues > topic 4.2
  • GAEC effect on eligibility > topic 4.3
  • Prioritization of ETS: Delineation of non-eligible areas: > See the Copenhagen Q&A

QE1 (total eligible area): proposed solutions:

  • 1st: Marginal land: wetland, boundary between pastures with or without trees, natural grassland/grassland > topic 4.2
  • to be pro-rata until 2013, from 2014 eligible if the farmer is carrying out an agricultural activity; > Under the current Regulations that land is not fully eligible
  • Flexible thresholds in future "> topics 3.2 3.3"
  • waivers > topic 1.5
  • 2nd: Delineation: small parcels, cadastral parcels as RP, image quality
  • Copy/paste should be OK > topic 1.6
  • Waiver for cadastre parcel > there are several waivers for the cadastral parcel
  • 3rd: GAEC: when do commitments become eligibility conditions > topic 4.3

Eligibility profile: comments

  • Is a forest eligibility profile necessary (II pillar)? > it is not required, except for eligible reafforestation
  • Determination/codification of land cover – use of the JRC system or the MS classification system? Use of the common minimum mappable legend is confusing.
>
  • The LCCS is a classification system to describe the land cover classes defined by the member states,
  • the JRC Minimum mapping legend is a legend: a combination of classes for the inspection mapping at approx. scale 1:5000
  • LFs: why needed in the eligibility profile if not mapped in the given LPIS?
> because the Eligibility profile inventories all eligible land covers, while the mapping legend contains only those which are mapped.

What is the difference between the natural grassland and grassland?

> natural grassland is self-seeded and not a result of cultivation (i.e. not sown)
  • XML cannot cope with more than one pro-rata value (ranges or variable values) > topic 5.1

QE2 (rate of area non-conformities): issues

  • 1st: Differences from digitizing reference parcels
  • Image quality > topic 1.1
  • Radiometric
  • Resolution
  • positional accuracy
  • acquisition moment
  • Operator subjectivity
  • Especially marginal areas > topic 4.2
  • Not all boundaries are visible
  • Standard error is not included in the inspection > topic 1.2
  • 2nd: Thresholds
  • arbitrary chosen > topic 5.2
  • Conformance level is not OK for small RPs > topic 1.2
  • Shape of RPs should play role as well > topic 1.2
  • 1ha too tight for very large parcels > topic 1.2
  • 3rd: Timing > topic 1.3
  • Updates not taken into account (beginning of year vs payments)
  • No time for action plan
  • Satellite image quality is not always sufficient (bracken >< rush) > topic 1.1
  • The procedure to apply the 5-meter buffer for RPs with an unclear boundary is completely unclear. > topic 2.4
  • Complexity
  • Why map separate Land cover features? > See Copenhagen Q&A
  • Why map small LF? > small LFs are only to be mapped if your national specifications do so too

QE2 (rate of area non-conformities): proposed solutions:

  • 1st: Differences
  • Copy/paste the RP and then verify > topic 1.6
  • Image resolution should be same as the ortho resolution (used for the creation of the RP) > topic 1.1
  • 2nd: Thresholds > topic 1.2
  • Tolerance should be the same as for the OTSC – perimeter based (max 2 ha) based on objective laboratory study > on tolerance: topic 1.2; on study, topic 5.4
  • 3rd: Timing:
  • Waivers for QE 2, if RP changed by the farmer or OTSC > topic 1.3
  • Change/ apply flexibility for the date when the sample is taken. (MS chooses date) > topic 2.5
  • earlier reporting to enable action plan? > topic 1.3
  • Quality
  • Increase the image resolution to 50 cm > topic 1.1
  • 5 m buffer: do not immediately put “0” for unclear LUIs. > topic 2.4
  • Complexity:
  • Map eligibility rather than land cover > See Copenhagen Q&A
  • Complexity: reduce the features that need to be mapped > See FAQ

inspection methods: comments

  • Why not do the GPS-based ETS together with the OTSC?
> there is no fundamental objection if the random sampling can be guaranteed. See also FAQ
  • ETS documentation is not clear – the concept of sub-population is not clear. > topic 1.4
  • Underdeclared parcels often digitised too small -> affect QE2 and QE5 > WD: this makes little sense, did you mean too large ?
  • Difficult to inspect the TB > topic 5.3

QE3 (causes): comments

  • Part of the non-conformities are caused by the time difference between data exchange and the QA ETS > topic 1.3
  • Limited reporting on type of causes. (too few causes? More need to be added ?)
>JRC: the 6 (incl. historic GAC) generic causes can surely be detailed or subcategorised to support the analysis, but we haven’t identified another generic cause yet. We welcome proposals based on the 2010 experiences
  • What are “incomplete processing” and “incompatible LPIS-design”?
>
incomplete processing: occurs when someone has not done all that needed to be done
incompatible LPIS-design: indicates the system (LPIS and its operating procedures) is unable to identify any other generic cause for that observation
  • QE3 doesn’t take into account the national legislation
> please explain: the CAP is a common policy, ensuring equal treatment for all EU farmers
  • Use of older imagery should be allowed
> it is allowed in ETS v4.3, be it in an ancillary capacity

QE3 (causes): proposed solutions

  • 1st: The latest imagery is not always the best. The situation is sometimes not clear on the latest imagery (while it is on older imagery). Solutions:
  • Skip the RP
  • Add new non-conformity code, related to the limitation of the VHR
  • Add Waiver
> topic 1.1
  • 2nd: Part of the non-conformities are caused by the time difference between the exchange of data and the QA inspections (either raster or vector)
  • Add a new non-conformity code: imagery contains changes not yet reported on the imagery used for refresh
> topic 1.3
  • 3rd: QE3 does not consider national legislative requirements (splitting due to administrative boundaries)
  • Add a new non-conformity code, LUI contains administrative national boundaries > topic 5.3
  • Create a new waiver. > such waivers already exist

ATS: comments

  • ATS-log inconsistent, some errors found in the XSD
  • Are the cross-compliance elements part of the ETS?
  • What about if a MS has different types of RPs? How to apply then the ATS?
  • LCM model implies that LPIS data is “physically” stored in attribute tables. It doesn’t take into account that some of the values can be generated on-the-fly.
  • LCM should be logical e.g. no “attribute stored” (RPs can be also a result of separate spatial layers)


ATS: proposed solutions

  • 1st: Terminology: ATS mentions several elements `attribute stored’, while it is possible to get it through a spatial function.
Keep ATS model `logical’, implementation can be database field or function, maybe leave space for comments > Agreed: topic 5.5
  • 2nd: How do we identify the RP in the presence of separate spatial layers with different attributes? How do we fill in the ATS-log, since a simple yes/no might not be enough?
  • Leave space for comments and definitions
> not really needed in the ATS templates, the Feature Catalogue or Application Schema hold such information
  • 3rd: ATS-log scheme seems to contain inconsistencies
Fix XSD. > corrections were made in the aftermath of the Amsterdam workshop

QE4 (critical defects): comments

  • 1st: More and Better examples needed > topic 1.4
  • 2nd: Unclear LUI (boundary) should be better clarified as a concept > topic 1.4
  • 3rd: More waivers for PhB and CP > topic 1.5
  • Multi-parcels are not necessarily a CD
  • Explain difference multi-parcel vs multipolygon
[Image: Multi.PNG]
> two different cases of multiple cardinality, (see topic 5.3)
  • multi-parcel: a situation where the reference parcel aggregates/joins what should be separate RPs according to the intrinsic definition/specification used in the LPIS. It could result from any cause besides upgrade
  • multi-polygon: a situation where a single identified "unit" of land (i.e. one parcel-ID) relates to two or more distinct/disjunct plots in the field. It could result from erroneous processing or an incompatible LPIS design
  • Local conditions should be taken into account when evaluating a RP > conditions are already a structural element of the waivers

QE4 (critical defects): proposed solutions

  • More examples and definitions through WikiCAP > topic 1.4
  • Use of SDIC forum > would be much appreciated, topic 5.4
  • More examples needed > topic 1.4
  • Virtual/physical workshops for each type of RPs > topic 5.4
  • Peer-group to evaluate the LPIS QA results > topic 5.4
  • Drop all CDs except “no eligible area found” > topic 1.5; a revision of the CD selection will be made during the screening

Reporting: comments

JRC note: there has been some confusion on the scope of the topic 'reporting'. It was intended to cover the experiences with the assessment report and remedial plans, but the discussion seems to have focused on the data exchange of the inspection results. As a result, there is considerable overlap with the findings of the next working group. > topic 3.1

  • Not clear for some elements – explanation needed for every point
  • If there is a technical issue with the XML, is there any backup? > technical issues with standards are unlikely if the standards are respected
  • Why was XML chosen? JRC should make tools available to check if files are ok.
>
  • because of INSPIRE compatibility. Note that GML too is an XML grammar written in XML Schema. Relevant GI standards are ISO 19118 (Encoding) and ISO 19139 (GML)
  • for validation tools see our tools article; a minimal validation sketch, under stated assumptions, follows this list
  • Live demo on reporting needed
  • XML generates extra work for the MS and extra cost
> The effort is acknowledged, but it is an essential investment for an application which relies on GIS technology
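As a minimal illustration of what such a local check could look like (the file names are placeholders and this is not an official JRC tool), a reporting file can be validated against the published XSD before upload:

```python
from lxml import etree

# Placeholder file names; validate a reporting file against the published XML schema
# locally before uploading it to the portal.
schema = etree.XMLSchema(etree.parse("lpis_qa_reporting.xsd"))
document = etree.parse("ets_scoreboard.xml")

if schema.validate(document):
    print("document is valid against the schema")
else:
    for error in schema.error_log:
        print(error.line, error.message)
```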

Reporting: proposed solutions

  • 1st: clear information on what is exactly needed > topic 1.4
  • 2nd: We need a backup alternative > full web based access to the data (WFS) offered by the LPIS custodian would remove the need for the data exchange
  • 3rd: More support in general (presentations/wikiCAP)
  • Alternatives for XML and GML should be available (xls, dbf, shp)
> each of the three proposed alternatives has distinct disadvantages that substantially reduce their potential for international data exchange
  • Provide tooling / demonstrations > new tools have been placed on our tools article
  • Clear examples needed > topic 1.4
  • Sharing information between EU Member States is good thing > agreed
  • Option should be available in the Portal to view the XML after updating > JRC will consider this option

QE5 (declaration rate): comments

  • 1st: Monitoring “area declared”
  • implies land use concept, which is not relevant to LPIS > topic 3.3
  • It checks something that cannot be controlled, as there is no way to check farmers declarations
  • Under/over declaration – negative impact through false non-conformities > topic 3.3
  • 2nd: Uncertainty on the geometries digitized from scratch
  • allow partly corrected RP > what is meant by partly corrected RP?
  • Correct quantification of RP areas already measured by other QE > topic 3.3
  • 3rd: Need of further clarifications on the terms: > topic 1.4
  • Declared area
  • Claimed area
  • Farmer area
  • Paid area

QE5 (declaration rate): proposed solutions

  • 1st: skip 0% declared RPs from the sample (interpretation of table 15)
>
  • see topic 2.1
  • please note this is opposing the Dutch proposal to focus on the 0% declared
  • 2nd: remove the threshold and make the QE5 indicative value only > topic 3.2
  • 3rd: introduce waiver for under-declared RPs > topic 2.1

Data exchange: issues

  • 1st: Big waste of time to acquire the know-how on XML/GML
> The effort is acknowledged, but it is an essential investment for an application which relies on GIS technology
  • 2nd: No easy solution in sight: invest in that know-how
  • 3rd: Unstable XSD structures (many updates from JRC) > changes are kept to an absolute minimum

Data exchange: proposed solutions

  • 1st: Please provide tools for GML and XML generation and handling!
> new tools have been placed on our tools article
  • 2nd: Please warn for any changes in the schema versions (through watch page?)
> the standalone log article holds all changes to schemas since 20/12/2010
  • 3rd: Freeze the documentation and regulation :
> methodological changes are made upon request of the member states or to address urgent technical issues identified by the member states > topic 1.4

QE6 (land changes): comments

  • Does the 25% fit to the real world change rate or is it arbitrary? > topic 5.2
  • Is the imagery used for change detection, or for verification of detected changes?
> neither, the LPIS QA should detect unprocessed changes
  • Regional difference (e.g. ENG >< SCO)
> obviously "stable" landscapes require less update effort than those where competiton for land occurs (often from urbanisation pressure)
  • Difference between real world change and image model change
> changes reflecting the data in the model
  • What about other channels than farmer/inspector? Systematic changes of Mapping Agency included? > topic 5.2
  • Can one common approach be applied for all different LPIS models?
> Yes, all models should be equally capable of dealing with the change occurring
  • What attribute changes are relevant
> those attributes that have a significant effect on the eligibility represented by the reference parcels
  • What type of changes: ineligible features or boundaries? > both, see above
  • Background of the 25% not understood > topic 5.2
  • How does it relate to good update procedures/performance? > topic 2.2
  • Use ETS for producing change signals rather than measuring change > topic 2.2

> for more information on the LPIS update cycle, please look at the LPIS dataflow article

QE6 (land changes): proposed solutions

  • 1st: Why the 25% threshold
  • Keeping track of the change rate is good and useful, but it should not be a criterion. Alternatively, different thresholds could be applied > topic 3.2
  • Why not use the inspection results to establish the change rate threshold? > topic 2.2
  • 2nd: What are the relevant types and source of change
  • Look at rwc and determine baseline per MS > topic 2.2
  • Use inspection results to assess the change detection level > topic 2.2
  • 3rd: Fixed change rate threshold does not reflect different regions and different type of RPs suggested
  • Introduce different threshold based on rwc rates > topic 2.2

Orthoimagery: comments

  • Use NIR > topic 1.1
  • poor imagery affects LPIS QA (bad GCP, poor orthorectification) > topic 1.1
  • uncertainty due to subjectivity > topic 1.1
  • An inappropriate elevation angle causes poor orthorectification > topic 1.1
  • Occlusion by trees and buildings; shadows cause CAPI ambiguity > topic 1.1
  • Time of the acquisition (snapshot of winter/spring) > topic 1.1
  • Phenological development is sometimes an important factor > topic 1.1
  • At scale 1:10 000 – image of 1 meter resolution is good, but not for 1: 5000! > topic 1.1
  • Can we downgrade LPIS data to match 50 cm ortho quality? > No, deterioration of LPIS quality is not allowed.
  • Test conditions should equal the LPIS maintenance conditions > topic 1.1

Orthoimagery: proposed solutions

  • 1st: Pixel size and visual scale for CAPI impact the LPIS QA outcomes
  • To apply greater tolerance? > topic 1.2
  • Or even generalize to 1: 10000 > No, this would decrease the sensitivity of the assessment
  • standardize the scale to 1: 10 000 > No, this would decrease the sensitivity of the assessment
  • Acquire and produce own imagery with the desired resolution and parameters > Already allowed
  • 2nd: The “base” image quality should always be met
  • Increase image budget or decrease the thresholds
  • Source own imagery > Already allowed
  • 3rd: Bad timing of the image acquisition influences the results as well
  • Multiple acquisitions in order to create multi-temporal data? > topic 1.1
  • Lower angle images might be less affected by the tree overhang > topic 1.1
  • Source own imagery > Already allowed

> for all the unaddressed points above, please look at the extensive topic 1.1

QE7 (rate of irregularities): comments

  • 1st: What is area not found?
  • area not found or area not paid because of an error?
  • causes for area not found OTSC
  • Non-agriculture land found in declaration
  • Parcel Over declaration
  • under minimum size
  • parcel not found at all
  • eligible land but not used for farming
  • 2nd: separation of eligibility conditions on the land
  • Area not found: is it where there is a land cover conflict, or where the land is ineligible because of administrative rules?
  • Positive difference on application by application basis
  • 3rd: over declaration and compensation
  • Random sample is taken at farmer level so different approach
  • Area not found at dossier level (business level by the PA)
  • Area not found assessed at parcel level in the ETS
  • Justification of QE7
  • What is the correlation between the area not found during the OTSC and the LPIS quality?
  • Over declaration should be picked up at the time of the application process.
  • Differences between the MS declaration management and the 100% over-declaration test will introduce differences irrelevant for the LPIS QA

QE7 (rate of irregularities): proposed solutions

  • 1st: What is area not found?
  • decision needed on the exact meaning (same as reported by the PA or not?) > topic 1.4
  • clear definition
  • 2nd: separation of eligibility conditions on the land
  • Count only land connected “area not found”
  • Non-agri land in declared area
  • Found less than declared (over-declaration)
  • Parcel not found (withdrawn at the end)
  • Recalculation of value, this is not the value reported by the PA
  • 3rd: over declaration and compensation
  • consider the compensation occurred not at crop group level but at parcel level
  • Count only those parcels in the Total population, parcels outside the LPIS QA zone should not be counted
  • Justification of QE7
  • Different relevance for different RP type
  • AP most relevant
  • If area admin control is done during declaration, higher relevance
  • PB so relevant
  • Analyse declaration procedure and result
  • 2% is too strict. And define on dossier level linked to a penalty philosophy > topic 5.2
  • Inconsistent sample issues
  • selection of RPs is based on previous year (2009) so parcels newly declared in 2010 are not in the sample BUT these parcels are used for QE 6 and QE 7

> for all the unaddressed points above, please look at the extensive topic 2.3

Population and scope: comments

  • 1st: Problems with small parcels
  • Measurements below the tolerance limit for specified image quality > topic 1.1
  • Often issue for AP > topic 5.3
  • 2nd: Undefined RP in QE5
  • RPs with a critical defect or for which the LUI cannot be determined are considered in QE5, although their area is undefined
  • 3rd: Parcels with no SAPS/SPS payment
  • Should ”declared parcels” include RPs with no SPS/SAPS payment but subject to GAEC or other payments? > topic 1.7 and LPIS QA scope.
  • Independent selections
  • how to select the sub population?
  • Sampling sites must be representative for the ETS
  • Risk factor used for OTSC must be independent of LPIS
  • Must also be independent of IACS elements
> See LPIS zone selection

Population and scope: proposed solutions

  • 1st: Problems with small parcels
  • Exclude RP below a certain size from ETS > topic 1.2
  • This exclusion size can be based on image quality criteria > topic 1.1
  • 2nd: Undefined RP in QE5
  • Exclude RP with unclear boundaries from QE5
> by setting Aobs and Arec to zero, these parcels are already effectively removed from the QE5 results
  • 3rd: Parcels with no SAPS/SPS payment
  • Exclude parcels which are part of the declaration but have no SAPS/SPS payment
> topic 1.7 and LPIS QA scope
  • Inconsistent sample issues
  • Use the same set of RPs for ETS-CAPI and for IACS value QE > topic 2.6
  • We need to include the PBs newly declared in 2010.
> these “new parcels” are “in process” and therefore represent a heterogeneous subpopulation and a separate lot; see our FAQ. Testing this "new" lot probably requires a dedicated executive test suite (and more work for the member state)
  • Both years could be used: previous and the actual year. > see reply above
  • Independent selections
  • Further study on the risk factors used to select the sites and how they can be connected to the LPIS RPs.
  • Only those risk analysis sites where no supervised selection (risk-based or manual shifting to a problematic area) was applied can be in the sample.
> Please do this analysis and indicate potential problems; JRC will investigate during the screening whether a simple ANOVA might indicate dependencies on risk factors known prior to zone selection