Channel: Active questions tagged jq - Stack Overflow

How do I portably parse a JSON file using libjq?


Consider the following code snippet based on libjq.

#include <stdio.h>
#include <stdlib.h>
#include <jq.h>
#include <jv.h>

int main(void) {
    jq_state *jq = jq_init();
    if (!jq) {
        fprintf(stderr, "Failed to initialize jq\n");
        return 1;
    }
    const char *json_text = "{\"data\":\"abc\\u0000def\"}";
    jv parsed = jv_parse(json_text);
    if (!jv_is_valid(parsed)) {
        fprintf(stderr, "Invalid JSON\n");
        exit(0);
    }
    if (!jq_compile(jq, ".data")) {
        fprintf(stderr, "Failed to compile jq filter\n");
        exit(1);
    }
    jq_start(jq, parsed, 0);
    jv data = jq_next(jq);
    printf("jv_kind = %s\n", jv_kind_name(jv_get_kind(data)));
    const char *str = jv_string_value(data);
    int length = jv_string_length_bytes(data);
    for (int i = 0; i < length; i++) {
        printf("%c", str[i]);
    }
    return 0;
}
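For reference, I build it roughly like this (assuming libjq and its headers are installed in the compiler's default search paths; on a Homebrew install you may need to add -I/-L flags pointing at the Homebrew prefix):

cc -o hello_jq hello_jq.c -ljq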

On macOS, this code outputs null bytes where the string contents should be:

./hello_jq | xxd
00000000: 6a76 5f6b 696e 6420 3d20 7374 7269 6e67  jv_kind = string
00000010: 0a00 0000 0000 0000                      ........

On Linux, this code outputs the expected value:

./hello_jq | xxd
00000000: 6a76 5f6b 696e 6420 3d20 7374 7269 6e67  jv_kind = string
00000010: 0a61 6263 0064 6566                      .abc.def

What is causing this discrepancy, and how do I get the proper behavior on macOS?

On both systems, the jq CLI tool produces the expected output in raw mode.
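By "raw mode" I mean the CLI's -r (--raw-output) flag; feeding the same document through the standalone jq binary shows the full byte sequence, embedded NUL included:

printf '%s' '{"data":"abc\u0000def"}' | jq -r .data | xxd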

