#322 — September 11, 2019


Postgres Weekly

A Primer on Postgres Upgrade Methods — Postgres 12 is just around the corner, so it’s worth revisiting what’s involved in upgrading your Postgres installation. Kaarel looks at the pros and cons of three different approaches.

Kaarel Moppel

A Discussion of High Availability Options for Postgres in Containers — Clustered containers running Postgres require a controller to monitor and manage the cluster. Find the right one for your organization.

EnterpriseDB sponsor

Managing Postgres Users and Roles — This is a few months old but we missed it at the time. That’s a shame, because it’s a really neat introduction to working with users and roles in Postgres.

Yaser Raja

Vacuum Improvements in Postgres 12 — Postgres users sometimes complain about VACUUM maintenance tasks. Postgres 12 makes these just a bit less painful and easier to run frequently.

Shaun Thomas

What is an SQL 'Relation'? — A short but sweet introduction to what a ‘relation’ is. It’s not just the persistent tables you’re familiar with: “the truth is that any SQL query defines a new relation.” Dimitri has also released some related articles this week reintroducing us to joins and aggregates.

Dimitri Fontaine

Get More Out of Your Postgres Database — pganalyze shares what they learned while achieving a 3x performance improvement on their Postgres database. Lots of helpful best practices.

pganalyze sponsor

Tuning DB Parameters for Postgres 12 in Amazon RDS — Last week, Jignesh showed us how to use the Postgres 12 beta on Amazon RDS; now he looks at how to tweak its settings for better performance.

Jignesh Shah

How to Deploy and Manage Postgres on OpenShift using the ROBIN Operator

Ankur Desai

Installing pgAdmin 4 on Debian 10

Shahriar Shovon

Replicating PostgreSQL into MemSQL's Columnstore — How to get the high performance of MemSQL’s columnstore for queries while keeping transaction data in PostgreSQL for updates.

Oryan Moshe

supported by Chaos Conf

💡 Tip of the Week

Turning query results into JSON

Did you know that Postgres includes a function for turning query result rows into JSON? It's called row_to_json. Let's create a basic table containing records of two people, along with an array column of 'interests', to play with:

CREATE TABLE people
  (name text, age int, interests text[]);

INSERT INTO people (name, age, interests)
  VALUES
  ('Jon', 12, ARRAY ['Golf', 'Food']),
  ('Jane', 45, ARRAY ['Art']);

The most basic way to use row_to_json is with the ROW constructor, like so:

SELECT row_to_json(ROW(name, age, interests)) FROM people;

row_to_json
-----------------------------------
{"f1":"Jon","f2":12,"f3":["Golf","Food"]}
{"f1":"Jane","f2":45,"f3":["Art"]}

We can see that the different types of data in the table (strings, integers, arrays) are reflected in the output, but the column names aren't! Luckily there's a better way to tackle this: use a subquery instead:

SELECT row_to_json(q1) FROM
  (SELECT * FROM people LIMIT 1) q1;

row_to_json
-----------------------------------
{"name":"Jon","age":12,"interests":["Golf","Food"]}
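
Incidentally, since every Postgres table has an associated row type, you can also pass a table alias to row_to_json directly and skip the subquery entirely. Using our people table from above:

SELECT row_to_json(people) FROM people;

row_to_json
-----------------------------------
{"name":"Jon","age":12,"interests":["Golf","Food"]}
{"name":"Jane","age":45,"interests":["Art"]}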

row_to_json also takes an optional second argument which 'prettifies' the output by adding line feeds between top-level fields – to enable it, you'd use row_to_json(q1, true) in the example query above instead.
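
For example, prettifying the subquery version of our query from above would look something like this:

SELECT row_to_json(q1, true) FROM
  (SELECT * FROM people LIMIT 1) q1;

row_to_json
-----------------------------------
{"name":"Jon",
 "age":12,
 "interests":["Golf","Food"]}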

This tip of the week is brought to you by Chaos Conf, the flagship event for all things Chaos Engineering and resilience. Learn more at chaosconf.io.

🗓 Upcoming Events