Faced with a not-so-basic query in a Node.js project using Sequelize, I went straight to writing raw SQL instead of figuring out all the ORM mumbo jumbo that would be required.

This query had columns that resulted from SUM() aggregations. It turns out SUM() returns "bigint for smallint or int arguments, numeric for bigint arguments, otherwise the same as the argument data type" (per the PostgreSQL docs). By default, the pg driver represents both bigint and numeric as JavaScript strings, since they can exceed what a JavaScript number can safely hold.

Which sucks.
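To see why, here is what happens if you treat those stringified sums as numbers (a minimal sketch in plain Node.js; the rows are made up for illustration):

```javascript
// pg hands back bigint/numeric columns as strings by default,
// so naive arithmetic silently concatenates instead of adding.
const rowA = { total: '100' } // shape of what pg returns for a SUM() column
const rowB = { total: '250' }

const combined = rowA.total + rowB.total
console.log(combined) // '100250', not 350

// You only get the intended result after an explicit conversion.
const fixed = Number(rowA.total) + Number(rowB.total)
console.log(fixed) // 350
```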

Casting in SQL

Casting BIGINT to INT is simple enough in psql:

postgres=# SELECT SUM(1), pg_typeof( SUM(1) );
 sum | pg_typeof 
-----+-----------
   1 | bigint
(1 row)


postgres=# SELECT CAST(SUM(1) AS INTEGER), pg_typeof( CAST(SUM(1) AS INTEGER) );
 sum | pg_typeof 
-----+-----------
   1 | integer
(1 row)
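The same cast carries over to a Sequelize raw query. A sketch, assuming a hypothetical payments table with an amount column:

```javascript
// Casting in SQL so pg hands back an int4, which it parses as a number.
const sql = `
  SELECT CAST(SUM(amount) AS INTEGER) AS total
  FROM payments
`
// With an initialized Sequelize instance you would then run:
// const [rows] = await sequelize.query(sql)
// and rows[0].total arrives as a JavaScript number, not a string.
```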

Configure pg for int8 parsing

When initializing Sequelize, set this before creating the instance (as described in this PR):

require('pg').defaults.parseInt8 = true
const { Sequelize, DataTypes } = require('sequelize')

Solved.
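One caveat: parseInt8 makes pg convert the stringified int8 into a plain JavaScript number, which is only exact up to Number.MAX_SAFE_INTEGER (2^53 - 1). A minimal sketch of where that breaks:

```javascript
// Sums that fit in 53 bits round-trip fine.
console.log(Number('9007199254740991')) // 9007199254740991, i.e. Number.MAX_SAFE_INTEGER

// Beyond that, precision silently disappears.
console.log(Number('9007199254740993')) // 9007199254740992 -- off by one
console.log(Number.isSafeInteger(Number('9007199254740993'))) // false
```

For the SUMs in my query this was never a concern, but it is worth keeping in mind before flipping the flag globally.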