Working with the JSON data format
Downstream Processing
In addition to the tools that output and format JSON data for the user, there are several tools that process JSON downstream, taking it as input for other programs (post-processing).
If you receive JSON data via an interface, it is good programming practice to sanity check the received data before further processing. The sanity check includes two stages:
- Syntactic correctness – is the notation valid? Are all brackets balanced (an equal number of opening and closing brackets), and are all commas and quotation marks in place?
- Correctness of data fields – does the received data structure match the data definition (JSON schema)?
For the first stage, it is best to use JSONLint, which is described earlier in this article. For the second stage, you need the JSON schema that describes the data structure; you then compare this description with the received data.
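The first stage can also be handled in a script. Here is a minimal sketch in Python, using only the standard library, that checks syntactic correctness and leaves the schema check for later:

```python
import json

# A record in the style of the book inventory used in this article
raw = '{"author": "Stephen Fry", "title": "The Hippopotamus", "publication": 1994}'

try:
    data = json.loads(raw)  # stage 1: syntactic correctness only
except json.JSONDecodeError as err:
    print(f"Invalid JSON: {err}")
else:
    print("Syntactically valid; schema check (stage 2) still pending")
```

A missing quotation mark or an unbalanced brace makes `json.loads()` raise `JSONDecodeError`, which reports the line and column of the problem.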
On json-schema.org, you will find an overview of validators [15], sorted by the programming languages in which they were developed. For example, consider the validate-json tool implemented in PHP [16]. If you are more into Python, jsonschema [17] serves the same purpose. The two tools are called in the same way.
Defining the JSON Schema
Listing 10 shows the JSON schema with which you define the exact format of your data structure. The schema matches the book inventory used earlier in this article and is stored in the bookinventory-schema.json file in the local directory.
Listing 10
JSON Schema
{
  "$schema": "http://json-schema.org/draft/2019-09/schema",
  "title": "Book",
  "type": "object",
  "required": ["author", "title", "publication"],
  "properties": {
    "author": {
      "type": "string",
      "description": "The author's name"
    },
    "title": {
      "type": "string",
      "description": "The book's title"
    },
    "publication": {
      "type": "number",
      "minimum": 0
    },
    "tags": {
      "type": "array",
      "items": { "type": "string" }
    }
  }
}
The schema definition references the JSON standard used (the draft from September 2019, in this case) in the second line. The definition contains a number of keywords. Table 3 explains these keywords in more detail; a complete list of all supported keywords is available at json-schema.org [18].
Table 3
JSON Keywords
| Keyword | Description |
|---|---|
| $schema | Description of the schema specification |
| title | Title of the schema |
| type | Type of JSON data |
| properties | Properties of each value (keys and values allowed for the field) |
| required | List of required properties |
| properties.type | Data type of an entry |
| properties.minimum | Minimum value of an entry |
| properties.maximum | Maximum value of an entry |
| properties.minLength | Minimum number of characters for an entry |
| properties.maxLength | Maximum number of characters for an entry |
| properties.pattern | Regular expression for comparison with the value of an entry |
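The string-related keywords from the table are easy to demonstrate by hand. The following standard-library sketch applies minLength, maxLength, and pattern checks to a value the way a validator would; the constraints and the sample values are hypothetical and not part of the book schema:

```python
import re

# Hypothetical constraints in the style of properties.minLength,
# properties.maxLength, and properties.pattern from Table 3
constraints = {"minLength": 10, "maxLength": 17, "pattern": r"^[0-9-]+$"}

def check_string(value, c):
    """Return a list of violations, mimicking what a schema validator reports."""
    errors = []
    if len(value) < c["minLength"]:
        errors.append("too short")
    if len(value) > c["maxLength"]:
        errors.append("too long")
    if not re.match(c["pattern"], value):
        errors.append("pattern mismatch")
    return errors

print(check_string("978-0099457039", constraints))  # []
print(check_string("abc", constraints))             # ['too short', 'pattern mismatch']
```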
The next task is to validate the records by checking whether they match the specified schema. Listing 11 shows a single record from the book inventory in readable format. The compact version of the record contains all the braces and fields in a single line.
Listing 11
JSON Record
{
  "author": "Stephen Fry",
  "title": "The Hippopotamus",
  "publication": 1994
}
The validate-json tool expects two parameters in the call: the dataset and the schema (Listing 12). If everything goes well, the call produces no further output (line 2); otherwise, validate-json grumbles (lines 4 and 5). To provoke the error message starting in line 4, we turned the numeric value for the year of publication (1994) into the string "1994", which means that the data type in the dataset no longer matched the data type stored in the JSON schema. validate-json has every reason to complain.
Listing 12
Calling validate-json
01 $ validate-json record.json bookinventory-schema.json
02 $
03 $ validate-json record.json bookinventory-schema.json
04 JSON does not validate. Violations:
05 [publication] String value found, but a number is required
Some programming languages also offer suitable helper libraries. In Python, for example, you can use jsonschema; for Node.js, you can use the Express framework.
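As a sketch of the library route, the check from Listing 12 can be reproduced with the Python jsonschema package; the schema below is abbreviated to the fields needed for the example:

```python
from jsonschema import validate, ValidationError

# Abbreviated version of the book schema from Listing 10
schema = {
    "type": "object",
    "required": ["author", "title", "publication"],
    "properties": {"publication": {"type": "number", "minimum": 0}},
}

good = {"author": "Stephen Fry", "title": "The Hippopotamus", "publication": 1994}
bad = dict(good, publication="1994")  # string instead of number, as in Listing 12

validate(instance=good, schema=schema)  # passes silently, like line 2 of Listing 12

try:
    validate(instance=bad, schema=schema)
except ValidationError as err:
    print(err.message)  # e.g. "'1994' is not of type 'number'"
```

Like validate-json, the library stays quiet on success and raises an error naming the offending field on a type mismatch.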
Processing JSON
The list of command-line tools and helpers that read, search, and modify JSON data is quite extensive. We stopped counting after more than 20 entries (see Table 4 for a sample). Developer Ilya Sher maintains a useful, commented overview of options [19].
Table 4
Command-Line Tools
| Tool | Application (selection) |
|---|---|
| faq, Xidel | Convert formats from and to JSON (BSON, Bencode, JSON, TOML, XML, YAML, etc.) |
| fx, gofx, jq, jid | Filter JSON data |
| jello | Filter JSON data with Python syntax |
| jtbl | Output to a table |
| Underscore | Processing via the command line |
Jtbl, for example, takes JSON records and knits a pretty table from them. In Figure 7, you can see how this table looks for the book inventory. Each record is shown in a separate row. Jtbl can only cope with flat JSON structures. It cannot handle nesting so far.
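What jtbl does for flat records can be approximated in a few lines of standard-library Python. This sketch (with made-up sample data in the style of the book inventory) computes column widths and prints one row per record:

```python
import json

# Two flat records; nested structures would need extra handling, as with jtbl
records = json.loads("""
[
  {"author": "Stephen Fry", "title": "The Hippopotamus", "publication": 1994},
  {"author": "Douglas Adams", "title": "Last Chance to See", "publication": 1990}
]
""")

headers = list(records[0])
# Column width: the widest cell in each column, header included
widths = {h: max(len(h), *(len(str(r[h])) for r in records)) for h in headers}

def fmt(values):
    return "  ".join(str(v).ljust(widths[h]) for h, v in zip(headers, values))

print(fmt(headers))
print(fmt(["-" * widths[h] for h in headers]))
for r in records:
    print(fmt([r[h] for h in headers]))
```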