JSONB arguments not treated consistently #2012

Open
jasperblues opened this issue Nov 26, 2019 · 18 comments
@jasperblues

jasperblues commented Nov 26, 2019

According to the docs, no special treatment is required for JSONB bind arguments. In practice there is:

  • When the argument is a single object, it is bound correctly
  • When the argument is an Array, it is necessary to call JSON.stringify on the args.

Here's a test case that reproduces the issue:

const { Client } = require('pg');
const { inspect } = require('util');
async function doIt() {
    const client = new Client();
    console.log('Connecting...');
    await client.connect();
    console.log('Connected');
    const a = [
        {
            the: 'quick',
            brown: 'fox',
            jumps: 'over the',
            lazy: 'dog',
            some: { other: 'object', dilroy: ['hello', 'world'], numpty: new Date() }
        }
    ];
    console.log('Inserting...');
    const result = await client.query(`insert into sb_json(c1) values ($1) returning *`, [a]);
    console.log(`result = ${inspect(result, { depth: null })}`);
    console.log('Disconnecting...');
    await client.end();
}
doIt()
    .then(() => console.log('All done'))
    .catch(e => console.log(inspect(e)));

The current documented contract is that no special treatment is required.

Possible solution:

  • Document the current behavior. This is the less ideal solution, as arguably it is better to be consistent: never call JSON.stringify on args, or always do so, rather than one behavior for arrays and another for single items.
  • Inspect the contents of the array. If it contains only primitive types, insert for that type; otherwise call JSON.stringify internally. This operation would run in O(n) time, so for a very large array it would be slow.
@charmander
Collaborator

Likewise sending an object to the PostgreSQL server via a query from node-postgres, node-postgres will call JSON.stringify on your outbound value, automatically converting it to json for the server.

With the existence of jsonb[] and without knowing the desired types ahead of time, no solution by inspecting JavaScript types can work.

See also #442

@jasperblues
Author

Yeah, I see what you're saying. It's a bit funky that node-pg assumes to stringify Objects but not Arrays. I think making it more consistent, or rather, no incoming magic (I like the magic coming out though!) would be for the best. It seems a little arbitrary, especially now that JSON has been added to the data-type mix, that JS arrays are assumed to be PG Arrays, when they could be type json or even hstore.
My preference would be to make no assumptions and leave it up to the library consumer. I'm not sure if that means node-pg should JSON.stringify Arrays and Objects by default, as that would be trading one magic for another. But it certainly does seem like a sane default.

^-- This comment from the linked ticket (#442) resonates with me. Better to be consistent in behavior.

  • Never call JSON stringify.
  • Always call it. (This appears to be impossible).

@oakgary

oakgary commented Jan 24, 2020

https://node-postgres.com/features/types
would be a good place to mention the current behaviour as it seems to be mentioned only in github issues right now

@Bessonov

@jasperblues @charmander we don't intend to use pg arrays. Is there any way to say "treat JavaScript arrays always as pg json"? We use knex, if it matters.

@Bessonov

I see the comment, probably from @brianc, but I don't see a way to replace the array handling: https://github.com/brianc/node-postgres/blob/master/packages/pg/lib/utils.js#L47

@bodinsamuel

Looked for this answer for so long.
If it's not fixable (not sure I understood the whole issue), this should at least be noted as a warning in the docs 👍

@brianc
Owner

brianc commented Apr 1, 2020

As a workaround you could 'wrap' your arrays in an object and turn that into the JSONB text you want:

class JsonBArrayParam {
  constructor(actualArray) { this.actualArray = actualArray }
  toPostgres() {
    return JSON.stringify(this.actualArray)
  }
}

client.query('SELECT FOO WHERE Bar = $1', [new JsonBArrayParam(['one', 'two'])])

or something similar to that. does that work?

@Bessonov

Bessonov commented Apr 1, 2020

Because we don't use pg arrays, we've implemented the following workaround:

// @ts-ignore
import pgUtils from 'pg/lib/utils.js'

// a workaround to force storage of array as json until
// https://github.com/brianc/node-postgres/issues/2012
// is fixed
const originalPrepareValue = pgUtils.prepareValue
pgUtils.prepareValue = <T>(val: T): T | string => {
	if (Array.isArray(val)) {
		return JSON.stringify(val)
	}
	return originalPrepareValue(val)
}

Maybe it helps someone.

@markddrake

Just hit this too. Ugly. Can we provide some kind of bind directive to tell pg how to interpret the array, or what the required behavior for a given insert statement is?

@boromisp
Contributor

If the library exposed the types parameter for prepared statements, there might be a way to do conditional serialization based on the type.

The API could look something like this:

pool.query({
  text: 'SELECT $1, $2, $3',
  values: ['1', 2, [1, 2, 3]],
  paramTypes: [null, null, OID.JSONB]
})

I'm not sure if it's any better than wrapping the values in a custom serialized type as suggested in #2012 (comment).

@peey

peey commented Apr 21, 2021

This and #1462 seem to be duplicates of each other

@rightaway

Is it possible to use setTypeParser to intercept jsonb, inspect if it's an array, then stringify it? Which type id can be used for this?

@charmander
Collaborator

@rightaway No, type parsers are for data coming from the PostgreSQL server. Serialization happens in utils.prepareValue, and it doesn’t have parameter type information.

@mariusa

mariusa commented Apr 10, 2022

Thanks for the stringify workaround!
How can I run this other query? stringify doesn't work in this case:
SELECT * FROM product where tags ?| ARRAY['tag1', 'tag2'];

I tried await pgclient.query(`SELECT * FROM product WHERE tags ?| $1`, [JSON.stringify(['tag1', 'tag2'])])
but the result is

SELECT * FROM product WHERE tags ?| ["tag1","tag2"]
ERROR unhandledRejection malformed array literal: "["tag1","tag2"]"

@boromisp
Contributor

The workaround provided in #2012 (comment) only works for them, because they do not use any SQL array inputs.

The jsonb ?| text[] → boolean operator expects an SQL text array on the right hand side, not a json or jsonb array, so you shouldn't JSON.stringify your inputs in this case.

@mariusa

mariusa commented Apr 11, 2022

Thank you!

mordae added a commit to mordae/node-postgres that referenced this issue Jan 22, 2025
Related to brianc#2012.

Since we cannot disambiguate between Arrays that should be converted to
PostgreSQL array types and Arrays that should be treated as JSON, make
it at least possible to define custom Array-derived types.

This also makes it possible to properly serialize Array-derived
multirange collections where elements must not be independently escaped.
@mordae

mordae commented Jan 22, 2025

(quoting @Bessonov's earlier workaround, which stringifies all arrays in prepareValue)

If #3360 makes it, it would be possible to implement this behavior like this:

Array.prototype.toPostgres = function() { return JSON.stringify(this) }

It would also be possible to define custom Arrays that would Just Work. For example:

class DateMultiRange extends Array<DateRange> {
  toPostgres(prepareValue: (x: any) => any) {
    return `{${this.map((range) => prepareValue(range)).join(',')}}`
  }
}
