Splitting a field

The GFID system allows splitting a field into multiple fields.

In the GFID system, the split operation is not exposed via a dedicated endpoint. Instead, the well-known POST /fields endpoint has been enhanced for this purpose to accept an array of fields as input (in addition to the currently supported single JSON input object).

So, to split a field, a client application calls the POST /fields endpoint with an array of fields as input and “autoreplace“ set to true for each of them. This forces the existing field to be set to expire and to be replaced by the input fields in the array.

It is the client's responsibility to ensure that the input fields split the existing field in the desired way: the GFID system does not validate whether or how the input fields relate spatially to the field they are meant to split.
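Since the system performs no spatial validation, a client may want to sanity-check its own input before submitting, for example by verifying that the combined area of the proposed parts roughly matches the area of the field being split. A minimal sketch in Python, assuming planar (projected) coordinates and using the shoelace formula; the 1% tolerance is an arbitrary illustration, not a GFID rule:

```python
def ring_area(ring):
    """Planar area of a polygon ring given as a list of (x, y) pairs,
    computed with the shoelace formula."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def split_covers_original(original_ring, part_rings, tolerance=0.01):
    """Rough client-side check: do the parts' areas sum to the original's?
    This does not detect overlaps and gaps that cancel each other out;
    a real client would use a geometry library for a proper coverage test."""
    total = sum(ring_area(r) for r in part_rings)
    original = ring_area(original_ring)
    return abs(total - original) <= tolerance * original
```

A check like this only reduces the chance of submitting a nonsensical split; it does not replace a proper geometric comparison.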

Prerequisites

Please note: write access to the API is available on request. Creating and updating fields is intended only for client applications with end users who are able to confirm the validity and suitability of input data as a canonical field boundary. Before granting access we will discuss with you how to ensure your users understand the implications of updating the Global FieldID map of fields. We will also assign you a unique source name to be used as input to the write endpoints.

How to split a field

Below are the optional and mandatory steps to split a field.

Optional step: Call the POST /fields endpoint with dry_run set to true

When calling the POST /fields endpoint with an array of fields as input, the system will try to create all the fields in the array but, as soon as one field fails to be created, the process is interrupted and an explanatory error is returned. The creation of fields is atomic: if not all fields can be created, then none of them will be.

A field in the array may fail to be created because, for example, it is too small; another may fail because it is too big. Since the POST /fields endpoint follows the “fail fast“ paradigm, you will uncover these errors one by one: only after amending the error in the first field and resubmitting the array will you discover the error in the second field.

This process can be very time-consuming, so to let you uncover all the errors at once we enhanced the POST /fields endpoint with the dry_run functionality. By setting dry_run=true in the input, you can simulate the outcome of the request without applying any changes to the data, and thus learn at once everything that needs to be amended to successfully execute the split.
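As a sketch, the dry-run request might be assembled as follows. Note that the base URL and the exact way dry_run is passed (here as a query parameter) are illustrative assumptions, not the documented contract:

```python
import json
import urllib.request

# Hypothetical base URL -- substitute the real GFID API host.
BASE_URL = "https://api.example.com"

def build_split_request(fields, dry_run=False):
    """Build a POST /fields request for a split. Every field in the array
    gets autoreplace=true so the existing field is expired and replaced.
    Passing dry_run as a query parameter is an assumption for illustration."""
    for field in fields:
        field["autoreplace"] = True
    url = BASE_URL + "/fields" + ("?dry_run=true" if dry_run else "")
    body = json.dumps(fields).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

# Simulate first: uncover all validation errors at once, change nothing.
req = build_split_request([{"boundary": {}}, {"boundary": {}}], dry_run=True)
```

Once the dry run comes back clean, the same payload can be submitted without the dry_run flag to perform the actual split.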

Mandatory step: Call the POST /fields endpoint with the array of fields as input (dry_run defaults to false)

Call the POST /fields endpoint with an array of fields, setting “autoreplace” to true for each of them. Please find below a sample input payload:

[
  {
    "boundary": {
      "type": "Feature",
      "properties": {
        "varda:source_name": "string"
      },
      "geometry": {
        "type": "Polygon",
        "coordinates": [
          [
            [0, 0],
            [0, 0],
            [0, 0],
            [0, 0]
          ]
        ]
      }
    },
    "autoreplace": true
  },
  {
    "boundary": {
      "type": "Feature",
      "properties": {
        "varda:source_name": "string"
      },
      "geometry": {
        "type": "Polygon",
        "coordinates": [
          [
            [0, 0],
            [0, 0],
            [0, 0],
            [0, 0]
          ]
        ]
      }
    },
    "autoreplace": true
  }
]

The API will try to create all the fields in the input. If any error occurs during the process, execution stops, no changes are applied to the data, and the error is returned. Otherwise the existing field is invalidated and the new fields are created, each assigned a brand new FieldID (the ID of the split field is not re-used).
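Because the operation is atomic, a client only needs to branch on overall success or failure; on success the response carries the newly assigned FieldIDs. The response shapes used below (a list of created fields each carrying a field_id, and an error object on failure) are illustrative assumptions, not the documented schema:

```python
def handle_split_response(status_code, body):
    """Interpret a POST /fields split response.

    On any error the API applies no changes, so there is nothing to roll
    back on the client side. On success, every created field carries a
    brand new FieldID; the split field's ID is never re-used.
    NOTE: the response shapes assumed here are for illustration only."""
    if status_code >= 400:
        # Atomic failure: the original field is untouched.
        raise RuntimeError("split rejected: " + body.get("error", "unknown error"))
    # Atomic success: the original field is expired and replaced by these.
    return [field["field_id"] for field in body]
```

A client would typically run the dry-run step until this function would succeed, then repeat the call without dry_run and store the returned FieldIDs in place of the expired one.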

Important considerations

Split (and merge) operations are reserved for physical fields and must not be used as a way to model crop zones. We do aim to let you model crop zones explicitly, but in future implementations.