LPISQA Legacy/2010 Workshop Copenhagen

From Wikicap - European Commission

This document contains the positions of the Commission services on the issues identified by the working groups during the 2010 LPIS conference, held in Copenhagen, 20-22 September 2010. It consists of two parts:

  1. the individual Q&A, in sequence of the quality element concerned.
  2. an inventory of issues and suggestions identified by the working groups with an index to the relevant Q&A topic.

These positions apply to the 2010 LPIS QA implementations and are, where appropriate, integrated in version 4.3 of the annexes and corresponding guidelines.

The Commission services may revise their positions upon the findings, experiences and results of the 2010 LPIS QA implementation.

Ispra, Brussels, 15/10/2010



1. The ATS is complicated and some concepts are unclear

  • On the complication side, there is not much that can be done:
  • the ATS is no more than a questionnaire that relates to the various elements of a model, and the model itself represents requirements from the Regulation and best practices. With respect to the LPIS QA, which is focused on pillar 1 aids, one could consider module A_132 (cross-compliance attributes) accessory.
  • The eligibility profile (part of the implementation conformance statement) is no more than an enumeration of the elements identified in module A_12.
  • Regarding the lack of clarity of some aspects, the JRC takes note of these comments. However, please observe that
  • the current version of the ATS, its concepts and relations, still relates to the LPIS core model representing pre-health-check specifications.
  • For several reasons, the JRC chose to freeze the model, and thus the ATS, until the results of the first application become available. Preliminary results indicate that this 2010 exercise has uncovered a number of practices that should be addressed in the model (e.g. sub-parcel/super-parcel, the equivalence of all declared land, ...).

An upgraded model of the LCM is needed before a new version of the ATS can be published. This new version will address the issues raised.

2. What should be the reference date for the input LPIS data, used in the ETS inspection – the date of the generation of the pre-printed forms for application or the date for the cross-check?

The Commission services acknowledge that for AP and FB RP-types, many changes to the reference parcels are expected and processed in the period between the production of the pre-printed form and the farmer's declaration.

For 2010, the Commission services consider that the date on which the pre-printed form was produced represents the common reference date for the assessment of reference parcel information. This date ensures the availability of a documented status of information and of a uniform methodology covering all designs and Member States. A number of the issues expected to occur with AP and FB RP-types are specifically addressed in the inspection guidelines.

3. Why submit the total RP population for sample-preselection?

This full population delivery is needed to verify the completeness of the LPIS population and the representativeness of the sample. Both elements shall be verified at the screening stage. Because only a very small sample is inspected, the LPIS QA results would be biased if some categories of reference parcels were excluded from the sampling process; a yearly extract of the full population allows automatic detection of such exclusions.

As the full population involves a large and cumbersome dataset, the Commission services are open to alternative methods that allow verification of completeness and randomness.

4. Why map the land cover and not simply map eligibility?

The scope of the LPIS QA is to provide Member States with knowledge of their full system. The Commission services request the collection of this detailed information during the inspection in order to enable analysis of the nature, source and reasons for the problems (anomalies) found. The inspection of the reference parcel at the appropriate land cover level can provide more evidence in support of certain findings during the screening. Also, as land cover is independent of aid scheme, this information will become of primary importance for near-future activities related to cross-compliance and the second pillar of the CAP.


  • The appropriate land cover classes are explicitly defined in 2009R1120 art 2 or by the Member State by way of its GAEC legislation.
  • Unlike eligibility, land cover is stable over time and independent of the Member State. This allows a robust and uniform inspection method common to all Member States.

To address the alleged extra cost of delineating land cover classes, rather than producing a single eligibility mask during inspection, the Commission services:

  1. encourage using automated detection and delineation methods that give the necessary guarantees of correct interpretation
  2. clarify that, unless coupled payments or pro-rata classes are applicable, the delineation key should NOT address the agricultural parcel level details, but SOLELY reflect "aggregated" land cover classes defined in R 1120/2009 art 2 and R 73/2009 art 124. These are “arable”, “grass”, “natural grass”, “permanent tree crop”, “permanent scrub crop”, “greenhouse”, “irrigated rice”, “short coppice plantation” and “kitchen garden”.

The Commission services stress that the delineation of appropriate land cover classes is required only for the LPIS QA inspection. It does not require the LPIS reference parcels to be differentiated this way, either graphically or alphanumerically.

5. How to deal with temporary (ineligible) land cover features?

The classifiers used for the land cover types relevant in this domain are not affected by temporary phenomena. So, if the inspector can determine that a feature is temporary, considering the local context, he should ignore that feature and simply apply the “underlying“ land cover class.

It is a judgement call where the interpreter should apply his knowledge of the local practices. Some examples:

  • The covering of grassland or arable land with a thin layer of sludge from the neighbouring canal will not change the long-term nature of the land cover.
  • A visible spray track on arable land will most likely be ploughed under by the next year. However, a path between two gates in a fence is likely to persist.

Although such temporal variations influence the land cover appearance, they do not influence its nature or description, and so the classification works independently of the date of observation.

6. Different datasets (ortho-images) are used for the LPIS update and the ETS. Are the mapping results comparable?

The WikiCAP guidelines give a series of practical recommendations regarding the use of CwRS imagery in the context of the LPIS implementation and of the landscape concerned. If in doubt, the MS can consider acquiring dedicated imagery with the same specifications as its LPIS ortho-imagery for a selection of the LPIS QA zones, while ensuring randomness and currency.

For 2010, the Commission services will evaluate the MS's explanations that non-conforming results are caused by the sub-optimal nature of the dataset used.

7. Why re-digitize a parcel that appears unchanged on the imagery?

Formally, re-digitizing of the reference parcel boundaries is not requested. What is required is the delineation, via the various land cover features present on site, of the agricultural land that can be eligible inside the LUI. This mapping procedure provides not only a total area measurement value but also more detailed information on the nature and abundance of the eligible land contained within the reference parcel.

Fundamentally, the digitizing process is the default procedure to collect an independent observation and measurement on a parcel. Random variations of the observed values are an element of the probability statistics that are the basis for the acceptance decisions. Mixing “copy/pasted” and observed data in the sample can create a heterogeneous sample that does not allow robust conclusions from the results, as long as there is no rule to ensure that the “copy/pasted” area/boundary is really “true”. As a result, the current method does allow visual inspection, but only in cases where no challenge to the recorded maximum eligible area or the correctness of the boundary can be made.

8. Can additional information from rapid field visits (RFV) be used in support of the visual interpretation?

RFVs can support visual interpretation. But please note that the main purpose of an RFV is not to provide supplementary information for a proper delineation of an unclear LUI boundary. RFVs are primarily intended to clarify unclear cases of land cover/land use interpretation.

For boundary delineation with field instruments (GPS etc), there is not yet a validated survey procedure available, but even if there were, this would not be considered a RFV but rather a “terrain inspection”.

9. Why distinguish between over- and underestimation of the maximum eligible hectare? Only the first one implies a financial risk to the fund.

The Commission services are not only interested in the financial risks to the fund, but also in the ability of the system to give the farmer proper support for the declaration process and to give an indication of the potential risk. As the differences are reported in gross terms, appropriate conclusions can be drawn.

10. Tolerances need to be introduced (especially for area based non-conformance)

The Commission services have a clear interest in the reporting of the original “raw” observations, but they also acknowledge that non-conformances based on exceeding an area difference threshold can be influenced by parcel size, source image characteristics and landscape, as well as interactions between them.

Considering that the Commission services desire a simple and uniform inspection method and that Member States have some control over their source imagery, a dependency of the conformity level threshold on parcel size has been introduced. The resulting variable threshold is based on the 3% accuracy expectation and the theoretical mean polygon measurement uncertainties (“tolerances”) with 50 cm GSD imagery, and becomes (further reading):

  Reference area     Conformity threshold
  > 5000 m2          3 %
  2000 – 5000 m2     5 %
  < 2000 m2          7 %

Methodologically, in the LPIS QA, Member States shall report the “distribution of reference parcels where the maximum eligible area takes ineligible areas into account or where it does not take agricultural area into account” with the raw observed data, but assess the proportion of non-conforming parcels using the above variable threshold.
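Applied per parcel, this rule can be sketched as follows. This is a minimal illustration in Python; the function names are hypothetical, and the treatment of the exact band boundaries (2000 and 5000 m2) is an assumption, since the table does not specify which band they fall into.

```python
def conformity_threshold(reference_area_m2):
    """Size-dependent conformity threshold, as a fraction of the
    reference area, following the variable threshold table above.
    Band-boundary handling (2000 and 5000 m2) is an assumption."""
    if reference_area_m2 > 5000:
        return 0.03
    if reference_area_m2 >= 2000:
        return 0.05
    return 0.07

def area_conforming(max_eligible_m2, observed_eligible_m2):
    """Assess a single parcel: the raw (gross) area difference is
    reported as-is, but conformity is judged against the variable
    threshold applied to the recorded maximum eligible area."""
    diff = abs(observed_eligible_m2 - max_eligible_m2)
    return diff <= conformity_threshold(max_eligible_m2) * max_eligible_m2

# A 10 000 m2 parcel with 9 750 m2 observed (2.5 % difference) conforms;
# a 1 000 m2 parcel with 920 m2 observed (8 % difference) does not.
```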

The introduction of these size-dependent thresholds creates "safety margins" of up to 140 m2 or 250 m2 for the smaller parcel categories. As a result, parcels with, for example, a newly constructed building inside the LUI could easily escape detection as non-conforming if only this area-based conformance test were applied. To prevent this escape, the guidance introduces a separate conformance test on the already observed occurrence of particular ineligible features inside the LUI.

11. Why not report only the true defects and skip reporting “potential” cases?

Removing “potential” defects directly from the equation is not good practice, as this would ignore a “real” issue that was actually picked up during the common inspection process. The Commission services however acknowledge that a potential defect can be excused from being considered a true defect, in case all of the following conditions are met:

  1. the type of potential defect can be described through well-specified criteria
  2. its existence is inherent to the particular reference parcel type
  3. the defect does not jeopardize the farmer declaration and administrative cross-check procedures.

The possibility to apply a “waiver” does not entitle the MS to automatically “whitewash” such parcels or to omit reporting a parcel that carries other defects or indications of non-conformance.

Methodologically, the original number of potential critical defects will be reported at an intermediate stage, whereas only the number of remaining true defects, for which no pre-defined “waiver” is applicable, will be used for the acceptance decision. If no other potential defect is present on the inspected parcel, the applicable waiver(s) shall be reported but the parcel remains "conforming".

The Commission services propose a variety of "waivers" and their conditions; the Member State shall indicate within its ATS-ICS which of these “waivers” are applicable.

Guidelines will be adapted accordingly.

12. Why is a parcel non-conforming if (a part of) its boundary is not visible, even if the area encloses eligible land?

This is a particular type of “potential critical defect”.

In general, the “non-conforming” status is attributed to an inspected parcel if it either has a critical defect or the eligible area found exceeds the conformity level. These two conditions act independently.

Parcels with unclear boundaries do have a serious defect: the boundary of the LUI cannot be identified and hence the area cannot be measured via the common inspection method. For this reason they are non-conforming.

On the other hand,

  1. visual inspection to excuse this “potential defect” is currently allowed if the local field conditions cannot challenge the statement that the LUI “encloses eligible land”. In practice, in the absence of any measure of absolute positional accuracy in the ETS, the presence of any ineligible feature within 5 metres of the perimeter of the LUI constitutes a challenge to that statement.
  2. for AP, FB and CP RP-types, specific waivers are introduced, specifying the external and local conditions to be verified for a vindication of this potential critical defect.

Parcels with the potential critical defect "Inability to identify LUI boundary" that are excused by either of the two mechanisms above (unchallenged visual inspection or application of an appropriate waiver) are still considered conforming.
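The decision logic of topics 10-12 can be summarised in a short sketch. This is an illustration only, with hypothetical boolean flags standing in for the actual ETS observations.

```python
def parcel_is_conforming(potential_critical_defect,
                         defect_excused,
                         area_test_passed):
    """Sketch of the conformance decision for one inspected parcel.

    - potential_critical_defect: a potential critical defect (e.g.
      "inability to identify the LUI boundary") was observed
    - defect_excused: the defect is excused by unchallenged visual
      inspection or by an applicable waiver
    - area_test_passed: the area difference stays within the
      size-dependent conformity threshold

    A true (unexcused) critical defect and a failed area test each
    independently make the parcel non-conforming."""
    true_defect = potential_critical_defect and not defect_excused
    return area_test_passed and not true_defect
```

For example, a parcel whose only potential defect is covered by an applicable waiver, and which passes the area test, remains conforming; a parcel with an unexcused defect is non-conforming even if its area test passes.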

13. When is there a need to do a LPIS “refresh”?

In the discussion document it is also written “systematic refresh using appropriately recent data source (in preference ortho-imagery) should be investigated”.

The Commission services will prepare documents on the refresh issue in the course of next year. The findings of the 2010 LPIS quality assessment can be considered in the guidance to be delivered.

14. What is the meaning of, and reason for, this rate of irregularities from OTSC? If it is a pure IACS query over the whole population, it is already reported to the Commission services.

The rate of irregularities from OTSC can be the result of a poorly functioning LPIS. As the LPIS should reflect the agricultural reality with regard to the eligibility of the land, ideally the OTSC should not detect a substantial amount of “additional errors". If the OTSC does detect significantly higher error rates from year to year, it can indicate a failure of the Member State to address LPIS issues. This causal relation is not present in the existing reporting.

15. There is no added value for AP and FB to perform the ETS

Experience shows that reference parcels based on AP and FB are not always as “pure” as the Member State assumes. The ETS, when correctly performed, will allow the identification of issues by systematically comparing the real world with the information recorded in the LPIS.

It is agreed that, with regard to the interpretation of the results on certain ratios, a distinction between the different systems could be needed. This will be evaluated at the end of the first year's exercise.

16. What will the Commission services do with the LPIS QA results?

It is important to point out that the exercise is above all a self-assessment exercise. It is a tool for the Member States to evaluate the situation of their LPIS and to determine the actions to be taken to remedy any problematic situation.

This is why the report, apart from the ratios as such, should focus on an analysis of the ratios and on an action plan indicating what measures will be taken to remedy the deficiencies established, as well as the timeline by which this will be done.

As with any quality assurance approach, it allows the Member States to be proactive.

The scoreboard results as such will not trigger the application of financial corrections. Although the scoreboard results are important, the Commission services are more interested in the actions that will be proposed to remedy the problems found.

Working group findings: experiences and suggestions

ATS and LPIS model elements experiences

  • ”farmer area” not clear: does it imply the declared, claimed area? > TOPIC 1 + WikiCAP FAQ
  • Subdivision of reference parcels (i.e. parcels and farmer blocks): do legal boundaries sufficiently identify the subdivision of reference parcels? > TOPIC 11
  • Rationale: ATS and ETS should be regarded as interdependent instruments… > TOPIC 1
  • Eligibility profile
  • Can we create new (MS related) LCC codes? > WikiCAP FAQ + Annex III
  • Can the same temporary features be mapped to different LCC? > TOPIC 5
  • ATS reporting
  • What should the ATS include (feature catalogue schema) > WikiCAP guidelines
  • Is there any deadline to send it to EC? > (EC) 2009R1122: 28/2/2011

ATS and LPIS model elements suggestions

  • ATS to the EC before starting the ETS? In order to validate the ATS performed by the MS, it might be advantageous to send preliminary ATS reports to the Commission services. Does the JRC agree to give feedback? In case No 2 = “yes”, what is the deadline for sending the ATS to the EC? > to be discussed on a bilateral basis
  • FC/AS issues
  1. Make the application schema more understandable by adding practical examples. MS need an instruction. > TOPIC 1
  2. More examples on feature catalogue
  3. More details on the LPIS core model > TOPIC 1
  4. The term “validity status” is not clear: that needs clarification > TOPIC 1
  5. What is the expected format to send the UML schema? INSPIRE/JRC uses EAP
  • ATS: We need one or more clear use cases

GML and data exchange experiences

  • GML is a suitable format for exchanging spatial data
  • BUT
  • requires knowledge,
  • big files with little information in them > TOPIC 3

GML and data exchange suggestions

  • schemas as simple as possible > TOPIC 1
  • MS still waiting for rest of XML/GML schemas…> In progress, introducing some innovations from the > TOPIC s above
  • consider the possibility to receive/publish the orthoimagery via WMS
  • consider the possibility to receive/publish the ETS results via web services

CAPI Inspection experiences

  • PLUS:
  • More efficient and less expensive than OTSC
  • Easier quality control
  • BUT
  • Repeatability
  • interpreting is agent dependent > TOPIC 7
  1. Requires different competences than OTSC
  2. Extensive experience with the CAPI operator: more representative sample
  • interpretation is image dependent > TOPIC 6
  1. Lack of quality reference
  2. Positional accuracy of the images
  • labour intensive, as all parcels have to be re-digitized > TOPIC 7
  • Not taking into account the changes from the farmers in the LPIS > TOPIC 2

CAPI Inspection suggestions

  • If there are no clear deviations from the RP registered in the LPIS, it is not necessary to re-digitize the whole RP again. Re-digitizing will always lead to some deviation from the RP registered in the LPIS, but this is only due to slightly different operator actions. What is the reason for the systematic redraw? > TOPIC 7

Population and sampling experiences

  • Definition of “total population” is not clear:
  • Why to send total population to JRC for pre-sampling? > TOPIC 3
  • What about RPs that hold only LF? If these can under no circumstances be attributed to a single traditional reference parcel, they should be inspected separately. Note that such LF-only RPs must therefore immediately border two distinct traditional reference parcels

Population and sampling suggestions

  • Change definition of total population: All reference parcels declared by farmers (if followed up in IACS-processes) or with non-zero eligible area
  • Problem: time gap between the relevant date for the sample (equal to the date of the pre-printed forms) and the date of taking the photo
  • Solution 1 (preferred): if focus is placed on the eligibility of parcels, the changes to reference parcels would not affect the test (or only very little)
  • Solution 2: instead of doing the ETS on the version of the reference parcels as pre-printed on the forms, take the version as declared by the farmer. Problem: this way it becomes possible to clean up the parcels before sending the sample to the JRC; the date of last update should then be recorded in the GML file

CwRS imagery use and alternatives experiences

  • Elevation Angle problem (IKONOS) > TOPIC 6
  • Shadows are a problem, time of acquisition in Southern Europe
  • Hilly Terrain - Southern Europe
  • Small parcels (easily goes over threshold - areal uncertainty)
  • Problem to determine temporarily non-agricultural activities on 1 image set > TOPIC 5

CwRS imagery use and alternatives suggestions

  • ETS test on imagery of the same (or at least similar) quality as the LPIS imagery was also suggested
  • Input data the same for all MS ? > TOPIC 6
  • Threshold 97 -103 % too strict > TOPIC 10
  • CwRS flexible season > TOPIC 6
  • Option for orthophotos paid by JRC budget? No, this is legally not possible. The VHR image acquisition uses a DGAgri budget sub-delegated to JRC and is governed by Council Regulation 165/1994
  • New data or zones for ETS test? > TOPIC 6

Eligibility profile and Land cover experiences

  • unclear what ancillary data can be used: application data, cadastre data, OTSC (rapid field visits), other (ortho)images
  • define
  • temporarily not used areas, example: wasteland close to canal > TOPIC 5
  • when images tilt, different inspectors will come to different polygons > TOPIC 6
  • cases with long boundaries of physical blocks where 23 cm on 50 cm images decides between conformance and non-conformance > TOPIC 6

Eligibility profile and Land cover suggestions

  • The ETS is/should be an eligibility matter and not an LCC list; why digitize up to 10 different elements in one physical block? > TOPIC 4
  • If MS must not split into non-eligible vs. eligible, the JRC should explicitly allow the creation of MS-specific LCCs, e.g. putting arable land together with grassland, plus MS-specific critical defects > TOPIC 4

QE1 experiences

  • Temporary features: are they possible (eligible)? Can they be mapped by MS' own decision? If temporary features have to be mapped and sent, can MS create their own LCC for them? > TOPIC 5
  • What has to be reported to the Commission services? Everything or just results? > WikiCAP guidance
  • Can we define property boundaries or other “hints” to determine boundaries between parcels? > TOPIC 8 + > TOPIC 1.12

QE2 experiences

  • good indicator for LPIS technical quality
  • BUT
  • strong dependency on imagery quality and experience of operator, > TOPIC 6
  • very critical on small parcels, > TOPIC 10
  • not considering temporary ineligible features (!)> TOPIC 5

QE2 suggestions

  • consider treating temporary ineligible features as eligible features according to historical imagery > TOPIC 5

QE3 experiences

  • Conformance level between 97 % and 103 % (quality aspect 2) is inappropriate for very small RPs and for long, narrow RPs. Differences between resolutions of VHR imagery > TOPIC 10
  • Slow process > TOPIC 4

QE3 suggestions

  • If those generic processes are to be exhaustive, they must refer to realistic and concrete situations > TOPIC 11
  • More flexibility with adaptation to different cases > TOPIC 11
  • Ineligible features with an area > 0,3 ha are considered a major error. > TOPIC 9
  • The threshold could be a function of the size and shape. Use a buffer tolerance? > TOPIC 10
  • A different categorisation should be used depending on whether the area is overestimated or underestimated > TOPIC 9
  • The 3% conformance level should be replaced by a ‘tolerance’ based on the perimeter (as used in the on the spot controls). > TOPIC 10
  • To delineate only the eligible features and not to draw the LCCS. > TOPIC 4

QE4 experiences

  • potential critical defects: all the potential critical defects need to be investigated to check whether they really are critical defects, yet the quality element is still based on potential critical defects instead of real ones? > TOPIC 11
  • The definition of the potential critical defects is not always clear, e.g. potential discontinuity -> example on page 56 of the guidelines
  • The critical defect “unclear boundaries between eligible agricultural land” is considered not relevant; overlaps with adjoining parcels would reveal it > TOPIC 12

QE4 suggestions

  • Drop ”unclear boundaries” > TOPIC 12
  • General remark: the entire ATS and ETS should be simplified a lot > TOPIC 1

QE5 experiences

  • A careful farmer declares less than the maximum eligible area
  • Does not give information on quality of LPIS (Agri Parcel/Farmer Block) > TOPIC 15
  • Declared area independent from LPIS > TOPIC 9
  • Clarification of declared area vs. determined (observed) area (SPS: entitlements; SAPS: number of hectares) > TOPIC 1

QE5 suggestions

  • The ratio of declared area to the maximum eligible area should take into account the type of RP used by the Member State. RP: physical blocks - large problem

QE6 experiences

  • normal changes in parcels sum up to non-conformance (applications …); plus: different reference systems have different reasons for changing polygons, i.e. why has the parcel changed? What is a “land change”? > permanent physical changes of the land that impact the IACS in general and the eligibility of the land in particular
  • what does “refresh” mean/imply? Plus: if the change to a parcel is part of the update process (administrative process), is it to be counted as a land change? How to deal with reaching the 25 %, i.e. when to do a refresh? > TOPIC 13
  • If all QEs reach thresholds but QE 6 does not, why do a total refresh? > TOPIC 13

QE7 experiences

  • conformance level: “the OTSC rate of irregularities shall not exceed 2 % AND shall not be higher than the rate observed in the preceding application year”. Problem: what does this mean? Always not higher than the preceding year, even when already below 2 %? Fluctuations are normal > TOPIC 14
  • what does OTSC mean? Which OTSC? [take entitlements into account] How to calculate this? > those that involve AREA NOT FOUND

  • technical aspect of how to compare inspectors' polygons with the sample? Again: is the sample relevant, or is it to be done nationwide?

QE7 suggestions

  • The guideline for QE 7 should be explicitly explained