Handle blocking input that doesn't cover all encodings #627

@hardbyte

Description

blocklib filters out records according to the blocking specification. It warns if not all records are included in a block after applying a particular blocking schema. However, since another party may have produced the blocking schema, it seems reasonable that the Anonlink service should accept the blocking data as given, even if it doesn't cover 100% of the records. The service could warn that not all records are covered, or do something else (put the strays into their own block?).

Alternatively, clients could filter out the records that are not part of any block and skip uploading the CLK encodings for those records, as sketched below.
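
A minimal sketch of that client-side filtering, assuming blocklib's generate_candidate_blocks and that the result's blocks attribute maps block keys to lists of record indices; the records and clks variables are hypothetical and must be aligned by index:

from blocklib import generate_candidate_blocks

# `records` holds the PII rows and `clks` the matching encodings,
# aligned by index; both names are placeholders for this sketch.
result = generate_candidate_blocks(records, blocking_schema)

# Collect the index of every record that landed in at least one block.
covered = set()
for record_indices in result.blocks.values():
    covered.update(record_indices)

# Upload only the encodings for covered records.
clks_to_upload = [clk for i, clk in enumerate(clks) if i in covered]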

As an example, here is a (terrible) blocking schema for the febrl4 dataset that excludes a few records:

blocking_schema = {
    "type": "p-sig",
    "version": 1,
    "config": {
        "blocking-features": ["given_name", "surname"],
        "filter": {
            "type": "ratio",
            "max": 0.1,
            "min": 0.01,
        },
        "blocking-filter": {
            "type": "bloom filter",
            "number-hash-functions": 10,
            "bf-len": 2048,
        },
        "signatureSpecs": [
            [
                {"type": "characters-at", "feature": "given_name", "config": {"pos": [0]}},
            ],
            [
                {"type": "characters-at", "feature": "surname", "config": {"pos": [0]}},
            ],
        ],
    }
}
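
The warnings below come from running blocklib over each party's records with this schema; a minimal sketch, assuming records_a and records_b hold the two febrl4 datasets:

from blocklib import generate_candidate_blocks

# Each data provider blocks its own records; blocklib prints a
# coverage warning and block statistics (shown below) per call.
blocks_a = generate_candidate_blocks(records_a, blocking_schema)
blocks_b = generate_candidate_blocks(records_b, blocking_schema)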

Blocklib notes that this could be an issue:

P-Sig: Warning! only 96.42% records are covered in blocks. Please consider to improve signatures
Statistics for the generated blocks:
	Number of Blocks:   37
	Minimum Block Size: 60
	Maximum Block Size: 475
	Average Block Size: 217.40540540540542
	Median Block Size:  207
	Standard Deviation of Block Size:  123.52293072306216
P-Sig: Warning! only 97.1% records are covered in blocks. Please consider to improve signatures
Statistics for the generated blocks:
	Number of Blocks:   39
	Minimum Block Size: 52
	Maximum Block Size: 456
	Average Block Size: 210.17948717948718
	Median Block Size:  193
	Standard Deviation of Block Size:  113.03838933250947

The Anonlink service then fails while importing the encodings:

2021-03-08 11:44:31 | File "/var/www/entityservice/tasks/encoding_uploading.py", line 77, in pull_external_data
2021-03-08 11:44:31 | assert count == len(encoding_to_block_map), f"Expected {count} encodings in blocks got {len(encoding_to_block_map)}"
2021-03-08 11:44:31 | AssertionError: Expected 5000 encodings in blocks got 4982
2021-03-08 11:44:31 | [2021-03-07 22:44:31,870: ERROR/ForkPoolWorker-2] Task entityservice.tasks.encoding_uploading.pull_external_data[005bf363-1176-40cd-a5c6-3c9f27f18bb0] raised unexpected: AssertionError('Expected 5000 encodings in blocks got 4982')
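
One possible shape for a server-side fix, replacing the hard assert with a warning and a catch-all block for the strays. This is a hypothetical sketch against the names visible in the traceback (count, encoding_to_block_map); logger, all_encoding_ids and the "__uncovered__" block key are assumptions, not actual entityservice code:

uncovered = count - len(encoding_to_block_map)
if uncovered > 0:
    logger.warning(
        "Only %d of %d encodings are covered by a block; "
        "assigning the remaining %d to a catch-all block",
        len(encoding_to_block_map), count, uncovered)
    # Hypothetical: give every stray encoding a reserved block key
    # so downstream processing still sees a block for each encoding.
    for encoding_id in all_encoding_ids:
        encoding_to_block_map.setdefault(encoding_id, ["__uncovered__"])

An alternative, matching the client-side suggestion above, is to accept the smaller set as given and only process the encodings that are covered by a block.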
