
Check for duplicate keys in a JSON file


I have a JSON file with the following contents:

{"id1": {"key": "value"    },"id2": {"key": "value"    }}

I want to check that each top-level key (i.e. id1, id2) appears only once in the file, and produce an error if it does not. So something like

{"id1": {"key": "value"    },"id1": {"key": "value"    }}

must be reported as an error.

Is there a way to do this with a JSON parser like jq or json-glib-validate?

I came up with a Python solution that works, but it would be nicer to use an actual parser.

This is supposed to be used in CI.

import collections
import json
import sys


def check_duplicates(pairs):
    # count how often each key occurs within a single object
    count = collections.Counter(i for i, j in pairs)
    duplicates = ", ".join(i for i, j in count.items() if j > 1)
    if len(duplicates) != 0:
        print("Duplicate keys found: {}".format(duplicates))
        sys.exit(1)


def validate(pairs):
    check_duplicates(pairs)
    return dict(pairs)


with open("file.json", "r") as file:
    try:
        # object_pairs_hook receives the raw key/value pairs of every object
        # before duplicates are collapsed, so the check sees all occurrences
        obj = json.load(file, object_pairs_hook=validate)
    except ValueError as e:
        print("Invalid json: %s" % e)
        sys.exit(1)
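
For reference, a rough jq sketch I have been considering (untested assumption on my part, requires jq 1.6+ for halt_error, and uses the same file.json name as above): jq's --stream mode emits one event per value and per closing bracket before duplicate keys are collapsed, so occurrences of each top-level key can be counted from the stream events.

jq -n --stream '
  [ inputs
    # event for a scalar or empty-container value of a top-level key: [[key], value]
    | if length == 2 and (.[0] | length) == 1 then .[0][0]
    # closing event of a non-empty object/array value of a top-level key: [[key, last_child]]
    elif length == 1 and (.[0] | length) == 2 then .[0][0]
    else empty
    end ]
  | group_by(.)
  | map(select(length > 1) | .[0])
  | if length > 0
    then ("Duplicate keys found: " + join(", ") + "\n") | halt_error(1)
    else empty
    end
' file.json

If no duplicates are found this prints nothing and exits 0; otherwise the message goes to stderr and the exit code is 1, which is what CI needs. Unlike the object_pairs_hook version above, this sketch only checks the top-level keys.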
