Description
This project proposes extending MapSwipe to identify and categorise the tentative area of an object using a tile server. The output can be used to train AI models. The approach builds on existing MapSwipe capabilities and aligns with fAIr's training needs, providing a scalable way to create AI-ready datasets: the training datasets generated by MapSwipe will be consumed by fAIr to train and improve its AI models.
High-Level Workflow
The bullet points below outline the high-level workflow of fAIrSwipe:
- fAIr provides an Area of Interest (AOI) and classification requirements.
- A new project is created in MapSwipe based on the AOI.
- MapSwipe users review imagery tiles and identify and classify footprints using tile-server-based workflows.
- Structured results are exported as training datasets.
- fAIr consumes these datasets to train and improve AI models.
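One way to picture the hand-off at the end of this workflow is a structured result record per screen that fAIr can ingest. The sketch below is a minimal illustration; all field names and values are assumptions, not a fixed schema:

```python
import json

# Hypothetical exported result for one MapSwipe screen (field names are
# illustrative; the real schema would be defined by the new project type).
result = {
    "project_id": "fairswipe-demo",            # assumed identifier
    "tile": {"z": 18, "x": 147620, "y": 120310},
    "grid": 4,                                  # 4x4 virtual sub-grid
    "mini_tiles": [
        {"row": 1, "col": 2, "state": "single_footprint"},
        {"row": 3, "col": 3, "state": "multiple_footprints"},
    ],
}

# Results would be serialised (e.g. as JSON) for export to fAIr.
print(json.dumps(result, indent=2))
```

Mini tiles with no taps could simply be omitted from the record, keeping exports compact for mostly-empty screens.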
Implementation Details
A new project type will be created through the Manager Dashboard, where project managers will define the AOI and select the grid size and object class.
Each task will present a single imagery tile per screen, virtually subdivided into smaller grid-based mini tiles according to project requirements.
Tiles can be divided into 2×2, 4×4, or 8×8 grids.
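Because the sub-grouping is virtual, each mini tile can be described by fractional slippy-map tile coordinates rather than by fetching extra tiles. A minimal sketch using the standard Web Mercator tile formulas (the function names are my own, not MapSwipe APIs):

```python
import math

def tile_to_lonlat(x, y, z):
    # NW corner of slippy-map tile (x, y) at zoom z; standard Web Mercator math.
    n = 2 ** z
    lon = x / n * 360.0 - 180.0
    lat = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / n))))
    return lon, lat

def mini_tile_bboxes(x, y, z, grid=4):
    """Subdivide tile (x, y, z) into grid x grid virtual mini tiles.

    Returns (row, col, (west, south, east, north)) for each mini tile;
    fractional tile coordinates give the sub-cell corners.
    """
    boxes = []
    for row in range(grid):
        for col in range(grid):
            west, north = tile_to_lonlat(x + col / grid, y + row / grid, z)
            east, south = tile_to_lonlat(x + (col + 1) / grid, y + (row + 1) / grid, z)
            boxes.append((row, col, (west, south, east, north)))
    return boxes
```

A 4×4 grid yields 16 bounding boxes per tile, which is also the geometry a training-data export would need to georeference each user selection.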
An example of a 4×4 virtually grouped grid is shown below:

Users will interact directly with the virtual mini tiles displayed on the project screen.
- One tap on a mini tile (or group of mini tiles) will highlight it in green, indicating the presence of a single potential footprint.
- A second tap will highlight it in yellow, indicating multiple potential footprints within the selected mini tile(s).
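The tap interaction can be modelled as a small state cycle per mini tile. The sketch below assumes a third tap resets the tile to untouched, mirroring the tap cycles in existing MapSwipe project types; that reset behaviour and the state names are assumptions:

```python
# Hypothetical tap states for a mini tile (names are illustrative):
# 0 = untouched, 1 = green (single footprint), 2 = yellow (multiple footprints).
STATES = ("untouched", "single_footprint", "multiple_footprints")

def advance_tap(state_index: int) -> int:
    """Advance a mini tile by one tap through the state cycle.

    untouched -> single_footprint -> multiple_footprints -> untouched
    (the wrap-around reset is an assumption, not stated in the proposal).
    """
    return (state_index + 1) % len(STATES)
```

Keeping the state as a small integer also makes the per-screen result cheap to store and transmit.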
The project output will capture information about the selected mini tiles for each screen, supporting downstream use in training data generation.
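For downstream training-data generation, the per-screen selections can be collapsed into a grid-shaped label matrix. This is a hypothetical helper, assuming integer states (0 = none, 1 = single footprint, 2 = multiple footprints):

```python
def screen_to_label_grid(mini_tiles, grid=4):
    """Collapse tapped mini tiles into a grid x grid label matrix.

    mini_tiles: list of {"row": int, "col": int, "state": int} records;
    untapped cells default to 0. Hypothetical sketch, not a MapSwipe API.
    """
    labels = [[0] * grid for _ in range(grid)]
    for t in mini_tiles:
        labels[t["row"]][t["col"]] = t["state"]
    return labels

# Example: two selections on a 4x4 screen.
taps = [{"row": 0, "col": 1, "state": 1}, {"row": 2, "col": 3, "state": 2}]
print(screen_to_label_grid(taps))
```

A matrix like this maps naturally onto the coarse segmentation masks or tile-level labels that fAIr's training pipeline could consume.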