Use PASSING with JSON_TABLE() To Make Calculations

I ran across this trick while preparing a talk for the Southern California Linux Expo (SCaLE) on using JSON_TABLE(). This is a 'reserve' talk, which means it will be presented only if another speaker drops out. But I thought this was interesting enough to share here. I will make sure the slides are posted online when they are finished.

Passing Interest

JSON_TABLE() temporarily converts JSON data into a relational table so that the data can be processed with Structured Query Language (SQL) commands. A handy feature of the PostgreSQL implementation is the PASSING clause. In the example below, you will see PASSING 1500 AS price_limit. This declares a variable named price_limit and assigns it the value 1500.

Later, in the body of the JSON_TABLE() call, you will see is_too_expensive BOOLEAN EXISTS PATH '$.price ? (@ > $price_limit)'. This creates a column named is_too_expensive of type BOOLEAN, so we get a true or false answer from the rest of the declaration. If an item has a price value, it is compared against price_limit, and the column reports whether the price exceeds the limit.

demo=# SELECT *
FROM JSON_TABLE(
    '{"guitars":[{"item": "Fender Telecaster", "price": 1000}, 
                 {"item": "Gibson Les Paul", "price": 2000},
                 {"item": "PRS semi-hollowbody", "price": 1500}]}',
    '$.guitars[*]' PASSING 1500 AS price_limit
    COLUMNS (
        row_num FOR ORDINALITY,
        Guitar TEXT PATH '$.item',
        Price INT PATH '$.price',
        is_too_expensive BOOLEAN EXISTS PATH '$.price ? (@ > $price_limit)'
    )
) AS my_example;

 row_num |       guitar        | price | is_too_expensive 
---------+---------------------+-------+------------------
       1 | Fender Telecaster   |  1000 | f
       2 | Gibson Les Paul     |  2000 | t
       3 | PRS semi-hollowbody |  1500 | f
(3 rows)
demo=# 
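As a side note, PASSING accepts multiple comma-separated name/value pairs. Here is a hypothetical extension of the same query (the bargain_limit variable and is_a_bargain column are my own invention, not part of the original example):

```sql
-- Sketch: PASSING with two variables instead of one.
-- bargain_limit / is_a_bargain are hypothetical additions.
SELECT *
FROM JSON_TABLE(
    '{"guitars":[{"item": "Fender Telecaster", "price": 1000},
                 {"item": "Gibson Les Paul", "price": 2000},
                 {"item": "PRS semi-hollowbody", "price": 1500}]}',
    '$.guitars[*]' PASSING 1500 AS price_limit, 1200 AS bargain_limit
    COLUMNS (
        row_num FOR ORDINALITY,
        guitar TEXT PATH '$.item',
        price INT PATH '$.price',
        is_too_expensive BOOLEAN EXISTS PATH '$.price ? (@ > $price_limit)',
        is_a_bargain     BOOLEAN EXISTS PATH '$.price ? (@ < $bargain_limit)'
    )
) AS my_example;
```

Each variable declared after PASSING becomes available inside the path expressions as $name.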

How Does This Help You?

JSON data is tough on relational databases because it cannot be processed with the regular SQL operators without special function calls, and those calls can slow a query tremendously. We could have pulled the data out of the JSON and then used a second query to make our quantitative determination. But that is another query on top of the JSON_TABLE() call, and two queries are going to be slower than one. Or we could have used some sort of CASE statement. But this is such a simple calculation that it seems tragic to separate it from the rest of the data.
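For comparison, here is a sketch of the CASE-statement alternative mentioned above. It produces the same result, but the comparison is pulled outside the JSON_TABLE() call and the limit has to be repeated as a bare literal:

```sql
-- The CASE alternative: compute the flag outside JSON_TABLE().
SELECT row_num,
       guitar,
       price,
       CASE WHEN price > 1500 THEN true ELSE false END AS is_too_expensive
FROM JSON_TABLE(
    '{"guitars":[{"item": "Fender Telecaster", "price": 1000},
                 {"item": "Gibson Les Paul", "price": 2000},
                 {"item": "PRS semi-hollowbody", "price": 1500}]}',
    '$.guitars[*]'
    COLUMNS (
        row_num FOR ORDINALITY,
        guitar TEXT PATH '$.item',
        price INT PATH '$.price'
    )
) AS my_example;
```

It works, but with PASSING the limit lives right next to the JSON path that uses it.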

This is a quick, and 'not dirty', way to handle the data.

And if you are at SCaLE, please say 'hello!'

