How can I have the first column automatically populated with specific values on each query of the formula?
This is the formula I had started with: ={"COGS",query(Sheet30!A2:D,"select * ")}, thinking that COGS would be in the rows below, but as soon as there is more than one row in the range POs!J2:V, it throws an error.
This is how it should look:
Here is a test sheet, in case you feel like trying it.
use:
=QUERY({
QUERY(Sheet30!A2:D, "select 'COGS',A,B,C,D label 'COGS''Type'", 1);
QUERY(Sheet30!A3:D, "select 'Expenses',A,B,C,D label 'Expenses'''", 0)},
"where Col2 is not null", 1)
I've spent a good hour or so googling and trying various combinations but without success.
I wish to select from a table where one of the columns is a single-dimensional array of varchar(255).
In normal SQL I use the following query:
SELECT * FROM customers WHERE email_domains @> '{"@google.com"}';
That works perfectly.
But now I want to do the same from code. So I have tried this:
domain = '@google.com'
sql = "SELECT * FROM customers WHERE email_domains @> '{%s}';"
cursor.execute( sql, [domain] )
result = cursor.fetchall()
and a whole load of various combinations of escaped ' and " but I cannot get it to work.
The error I get is this:
ERROR: malformed array literal: "{"
LINE 1: ... * FROM customers WHERE email_domains @> '{'@goo....
^
DETAIL: Unexpected end of input.
All help appreciated :)
Ok. At times I amaze myself with my stupidity...
The solution (for me) was to simply restructure my sql to this:
SELECT * FROM customers WHERE %s = ANY(email_domains);
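For completeness, here is a minimal sketch of how that restructured query can be run from Python with psycopg2 (same cursor and table/column names as above; untested):
domain = '@google.com'
# ANY() compares the parameter against each element of the array column,
# so a plain string parameter is all that's needed here
sql = "SELECT * FROM customers WHERE %s = ANY(email_domains);"
cursor.execute(sql, [domain])
result = cursor.fetchall()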
Thanks to everyone who offered suggestions.
psycopg2 converts lists to arrays for you. I can't test this easily at the moment, but I believe this should work:
domain = ['@google.com']
# Let psycopg2 do the escaping for you; don't put quotes in there
sql = "SELECT * FROM customers WHERE email_domains @> %s;"
cursor.execute( sql, [domain] )
result = cursor.fetchall()
The data in my column user_likes is as follows: anime^sitcom^scifi. I am trying to get the first item in the user_likes column as output.
My query is as follows:
SELECT split(user_likes,"\\^")[0] as likes from user_data_consolidated
but the output of this query is:
anime^sitcom^scifi
SELECT substr(user_likes, 1, locate('^', user_likes) - 1) AS likes
FROM user_data_consolidated
I tried to run the below query in a DB2 database:
My date string: 122887 mmddyy
select DATE(TO_DATE('122887', 'mmddyy')) from SYSIBM.dual;
The result is: 2087-12-28
But I am expecting 1987-12-28.
How can I achieve this?
You need to use the "adjusted year" for your query. Instead of YY it is RR:
values(DATE(TO_DATE('122887', 'mmddrr')))
1
----------
12/28/1987
Details are in the documentation for TO_DATE/TIMESTAMP_FORMAT.
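For comparison, a sketch of both formats side by side (yy gives the result from the question, rr the adjusted year):
values (DATE(TO_DATE('122887', 'mmddyy')),  -- 2087-12-28
        DATE(TO_DATE('122887', 'mmddrr')))  -- 1987-12-28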
I have many json arrays stored in a table (jt) that looks like this:
[{"ts":1403781896,"id":14,"log":"show"},{"ts":1403781896,"id":14,"log":"start"}]
[{"ts":1403781911,"id":14,"log":"press"},{"ts":1403781911,"id":14,"log":"press"}]
Each array is a record.
I would like to parse this table in order to get a new table (logs) with 3 fields: ts, id, log.
I tried to use the get_json_object method, but it seems that method is not compatible with json arrays because I only get null values.
This is the code I have tested:
CREATE TABLE logs AS
SELECT get_json_object(jt.value, '$.ts') AS ts,
get_json_object(jt.value, '$.id') AS id,
get_json_object(jt.value, '$.log') AS log
FROM jt;
I tried to use other functions but they seem really complicated.
Thank you! :)
Update!
I solved my issue by performing a regexp:
CREATE TABLE jt_reg AS
select regexp_replace(regexp_replace(value,'\\}\\,\\{','\\}\\\n\\{'),'\\[|\\]','') as valuereg from jt;
CREATE TABLE logs AS
SELECT get_json_object(jt_reg.valuereg, '$.ts') AS ts,
get_json_object(jt_reg.valuereg, '$.id') AS id,
get_json_object(jt_reg.valuereg, '$.log') AS log
FROM jt_reg;
I just ran into this problem, with the JSON array stored as a string in the hive table.
The solution is a bit hacky and ugly, but it works and doesn't require serdes or external UDFs.
SELECT
get_json_object(single_json_table.single_json, '$.ts') AS ts,
get_json_object(single_json_table.single_json, '$.id') AS id,
get_json_object(single_json_table.single_json, '$.log') AS log
FROM ( SELECT explode(
    split(regexp_replace(substr(json_array_col, 2, length(json_array_col) - 2),
          '"}","', '"}",,,,"'), ',,,,')
  ) AS single_json
  FROM src_table) single_json_table;
I broke the lines up so that it would be a little easier to read.
I'm using substr() to strip the first and last characters, removing [ and ]. I'm then using regexp_replace to match the separator between records in the JSON array and change it to something unique that can then be used easily with split() to turn the string into a Hive array of JSON objects, which can then be used with explode() as described in the previous solution.
Note: the separator regex used here ( "}"," ) wouldn't work with the original data set; there the regex would have to be ( "},\{" ) and the replacement would then need to be "},,,,{", e.g.:
split(regexp_replace(substr(json_array_col, 2, length(json_array_col)-2),
'"},\\{"', '"},,,,{"'), ',,,,')
Use the explode() function:
hive (default)> CREATE TABLE logs AS
> SELECT get_json_object(single_json_table.single_json, '$.ts') AS ts,
> get_json_object(single_json_table.single_json, '$.id') AS id,
> get_json_object(single_json_table.single_json, '$.log') AS log
> FROM
> (SELECT explode(json_array_col) as single_json FROM jt) single_json_table ;
Automatically selecting local only mode for query
Total MapReduce jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
hive (default)> select * from logs;
OK
ts id log
1403781896 14 show
1403781896 14 start
1403781911 14 press
1403781911 14 press
Time taken: 0.118 seconds, Fetched: 4 row(s)
hive (default)>
where json_array_col is the column in jt that holds your array of JSON strings.
hive (default)> select json_array_col from jt;
json_array_col
["{"ts":1403781896,"id":14,"log":"show"}","{"ts":1403781896,"id":14,"log":"start"}"]
["{"ts":1403781911,"id":14,"log":"press"}","{"ts":1403781911,"id":14,"log":"press"}"]
Because get_json_object doesn't support a JSON array string, you can concat it into a JSON object, like this:
SELECT
get_json_object(concat(concat('{"root":', jt.value), '}'), '$.root')
FROM jt;
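Individual fields can then be pulled out with an array index in the path, since get_json_object accepts [n] subscripts; a sketch for the first record of each array (untested, using a three-argument concat instead of the nested one above):
SELECT
  get_json_object(concat('{"root":', jt.value, '}'), '$.root[0].ts')  AS ts,
  get_json_object(concat('{"root":', jt.value, '}'), '$.root[0].id')  AS id,
  get_json_object(concat('{"root":', jt.value, '}'), '$.root[0].log') AS log
FROM jt;
This only extracts the first record of each array; to flatten every record, combine it with the explode() approaches above.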