I am making a POST request using Postman but getting this error:
validation failed: _id: Cast to ObjectId failed for value "
I am trying it this way:
My _id field is of type ObjectId in the schema:
So, how do I cast the value to an ObjectId for the _id field?
Thanks.
I am getting a very strange error today on a Salesforce instance with this SOQL query. This works in my other Salesforce instances but not this one.
select FIELDS(ALL) from Contact where Id = '003Dp000001i1X8IAI'
The error from Salesforce:
select FIELDS(ALL) from Contact where Id = '003Dp000001i1X8IAI'
^
ERROR at Row:1:Column:15
Invalid field: 'ALL'
Any idea why this wouldn't be working in this particular Salesforce instance?
I received this error when using Dapper. My code:
(dbConnection.QueryAsync("select * from AbpUsers", transaction: await GetDbTransactionAsync())).ToList()
The error:
System.Data.DataException: 'Error parsing column 2 (ExtraProperties={} - String)'
InvalidCastException: Invalid cast from 'System.String' to 'Volo.Abp.Data.ExtraPropertyDictionary'.
When I select only the "ExtraProperties" column from the "IdentityUser" table, it works fine. My code:
(dbConnection.QueryAsync("select ExtraProperties from AbpUsers", transaction: await GetDbTransactionAsync())).ToList()
I have a column (type TEXT) in my database where I store dynamic checklist items in the following format: [{"checked":"true","nome":"CNH Colorida"},{"checked":"false","nome":"Contrato social"},{"checked":"false","nome":"Última fatura de energia"}]. I need to get which of them are checked and which are not, using SQL. I used to have PHP code that json_decoded the string and then iterated over it, but I want to be able to do this directly in SQL. The PostgreSQL version is 9.4.
When I try to cast the returned row as json (select valor::json), nothing happens, and if I try to cast it to a json array (select valor::json[]), I get the following error:
SQL Error [22P02]: ERROR: malformed array literal: "[]"
Details: "[" must introduce explicitly-specified array dimensions.
From my research, I found that PostgreSQL arrays are delimited with {} rather than []. If I replace the [] with {}, I get this error:
SQL Error [22P02]: ERROR: malformed array literal: "{{"checked":true,"nome":"Importado"},{"checked":false,"nome":"Finame"},{"checked":false,"nome":"MDA"}}"
Details: Unexpected array element.
What else can I try to get it working?
Once you have converted the text to json you need to unnest the array and extract the key values:
select d.v ->> 'nome' as name
from the_table t, json_array_elements(t.valor::json) as d(v)
where d.v ->> 'checked' = 'true';
Online example: https://rextester.com/TSRF14508
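If you also need the items that are not checked, a minimal variation of the same query (a sketch assuming the same table and column names as above) is to return the flag as a boolean instead of filtering on it:
-- return every checklist item together with a boolean flag, rather than filtering
select d.v ->> 'nome' as name,
       (d.v ->> 'checked') = 'true' as is_checked
from the_table t, json_array_elements(t.valor::json) as d(v);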
I am trying to insert a series of values from an array of jsonb into Postgres. However, I get the error "invalid input syntax for uuid:".
My data looks like this:
[
{
code: "dwfwradsa",
purpose: "description",
id: uuid (real uuid, this is just a placeholder)
},
{repeat},
{repeat}
]
And the relevant part of my function is as follows. #codes is a parameter passed into the function.
INSERT INTO
"codes" (
"code",
"purpose",
"id"
)
SELECT
codes->>'code',
codes->>'purpose',
codes->>'id'::UUID -- this is a foreign key from another table
FROM jsonb_array_elements("#codes") codes
ON CONFLICT ON CONSTRAINT unique_code DO NOTHING;
Even casting does not seem to fix the problem.
If I do not cast, I receive this error instead: column "column_name" is of type uuid but expression is of type text.
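One likely explanation, not stated in the question, is operator precedence: in PostgreSQL the :: cast binds more tightly than ->>, so codes->>'id'::UUID casts the literal 'id' rather than the extracted value. A minimal sketch of the same statement with the extraction parenthesized, reusing the table, column, and constraint names from the question:
INSERT INTO
  "codes" (
    "code",
    "purpose",
    "id"
  )
SELECT
  codes->>'code',
  codes->>'purpose',
  (codes->>'id')::UUID -- parentheses: extract the text value first, then cast it to uuid
FROM jsonb_array_elements("#codes") codes
ON CONFLICT ON CONSTRAINT unique_code DO NOTHING;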
I am trying to run an asynchronous query. When I submit it to BigQuery, it responds that the table does not exist (status is "DONE"). What is wrong?
jobData = {'configuration': {'query': {'query': query, 'createDisposition': 'CREATE_NEVER'}}}
logger.info('jobdata: %s', jobData)
insertResponse = self.service.jobs().insert(projectId = PROJECT_ID, body = jobData).execute()
if 'errorResult' in str(insertResponse):
    logger.error('query insert failed: %s', insertResponse)
This worked a few weeks ago but no longer does. The error I get seems to indicate that the table cannot be found (or is it some kind of temporary table and all is well?)
jobdata: {'configuration': {'query': {'createDisposition': 'CREATE_NEVER', 'query': 'SELECT TIMESTAMP_TO_MSEC(timestamp) AS timestamp, location, branch, platform, description, original, metrics.group, metrics.id, metrics.name, metrics.value, DATE(timestamp) AS date FROM [mm.autotest] WHERE DATEDIFF(CURRENT_TIMESTAMP(), timestamp) < 14'}}}
query insert failed: {u'status': {u'state': u'DONE', u'errors': [{u'reason': u'notFound', u'message': u'Not found: Table ava-backend:_24bc4b39cef74439ece7e6f9a41399c21fccacd8.anonev_c7HQskNXMdaxCBDXdnAtOzlSJ6M'}], u'errorResult': {u'reason': u'notFound', u'message': u'Not found: Table ava-backend:_24bc4b39cef74439ece7e6f9a41399c21fccacd8.anonev_c7HQskNXMdaxCBDXdnAtOzlSJ6M'}}, u'kind': u'bigquery#job', u'statistics': {u'endTime': u'1390899670652', u'creationTime': u'1390899670344', u'startTime': u'1390899670652'}, u'jobReference': {u'projectId': u'ava-backend', u'jobId': u'job_EowCD-kiOirnlyRAQjFJNFoRFUY'}, u'etag': u'"11dTZYgUnUwbk8emYQU9mVRTTLs/0mRkexpB4dgaqONWphRujyRJhqM"', u'configuration': {u'query': {u'createDisposition': u'CREATE_NEVER', u'query': u'SELECT TIMESTAMP_TO_MSEC(timestamp) AS timestamp, location, branch, platform, description, original, metrics.group, metrics.id, metrics.name, metrics.value, DATE(timestamp) AS date FROM [mm.autotest] WHERE DATEDIFF(CURRENT_TIMESTAMP(), timestamp) < 14'}}, u'id': u'ava-backend:job_EowCD-kiOirnlyRAQjFJNFoRFUY', u'selfLink': u'https://www.googleapis.com/bigquery/v2/projects/ava-backend/jobs/job_EowCD-kiOirnlyRAQjFJNFoRFUY'}
The query works fine in the BigQuery web interface, and also when run synchronously with 'query' instead of 'insert'.
Could it possibly be an authentication issue? When I run the query with "query" it works but I get a warning message:
Checking for id_token.
id_token verification failed: Can't parse header: '\xc9\xad\xbd'
Checking for oauth token.
Client ID is not allowed: 173893847593-pdkumgr5bhobh3tv6qhqqfvel2tvnns0.apps.googleusercontent.com
URL being requested: https://www.googleapis.com/bigquery/v2/projects/ava-backend/queries?alt=json
The issue is with 'createDisposition': 'CREATE_NEVER'. This tells BigQuery that you never want to create a table for the query results. So it fails when the query result table doesn't exist. If you remove that create disposition, it should work.