Channel: Active questions tagged jq - Stack Overflow

array of records without jq's "slurp"


Edit - I was not clear. My apologies.

  1. My customer's automation produces JSON-like files that can be enormous, tens of gigabytes or more; I cannot control their size or content.

  2. The files aren't valid JSON; they are sequential JSON records with no separators between them. They look like this:

    { "a":1, "b": 2, ... }{ "a":2, "b": 4, ... }{ "a":3, "b": 6, ... }

  3. Our software runs at customer site, autonomously, without my team being present after the initial setup.

  4. Customers have many files, and I have many customers. Custom coding is a last resort.

  5. I have jq in my environment. I would prefer to use what I already have.

Given the above set-up, I fear jq -s will load an entire multi-gigabyte file into memory.

I need to convert the above semi-json into something valid like:

[{ "a":1, "b": 2, ... },{ "a":2, "b": 4, ... },{ "a":3, "b": 6, ... }]

and I would like to stream the JSON while making this conversion, to keep resource consumption low.

Using jq --slurp ".", the files are converted to the desired array of records. However, slurp pulls the entire file into memory, and that is not acceptable.

Using jq, what is an alternative "streaming" method?
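To illustrate the conversion being asked about, here is a minimal sketch of one streaming approach (my assumption, not from the question: jq and awk are both available, and the file names are hypothetical). By default jq reads concatenated JSON values one at a time, so no -s is needed to parse this input; a trailing awk pass adds the array brackets and commas, and neither tool ever buffers more than one record:

```shell
# Hypothetical sample input: concatenated JSON records, no separators.
printf '{"a":1,"b":2}{"a":2,"b":4}{"a":3,"b":6}' > records.json

# jq parses each concatenated value in sequence; -c emits one compact
# record per line. awk wraps the stream in [ ] and inserts commas between
# records, so the whole file is never held in memory at once.
jq -c '.' records.json \
  | awk 'BEGIN{printf "["} NR>1{printf ","} {printf "%s", $0} END{print "]"}' \
  > records-array.json

cat records-array.json
# [{"a":1,"b":2},{"a":2,"b":4},{"a":3,"b":6}]
```

Whether this fits depends on whether a downstream consumer truly needs one valid JSON array on disk, or can itself consume the record stream that jq -c emits.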

