So I have this struct:
type MessagesStored struct {
	MessagetoStream string
	StreamSub       string
	PubName         string
	CreatedAt       time.Time
}
And I have a table in Postgres to match the struct:
CREATE TABLE "StoredMessages"
(
"Message" character varying(200) COLLATE pg_catalog."default" NOT NULL,
"PubName" character varying(50) COLLATE pg_catalog."default" NOT NULL,
"Subject" character varying COLLATE pg_catalog."default" NOT NULL,
"CreatedAt" timestamp with time zone
)
So I am trying to select all the rows where the subject matches a string and store them in a slice of the struct:
var OldMessages []MessagesStored
// Select the columns explicitly (SELECT * returns them in table order, which
// doesn't match the Scan order below), use a $1 placeholder, and keep the
// returned rows. The quoted identifiers match the quoted names in the DDL.
rows, err := db.Query(`SELECT "Message", "Subject", "PubName", "CreatedAt" FROM "StoredMessages" WHERE "Subject" = $1`, FilterSub)
CheckError(err)
defer rows.Close()
for rows.Next() {
	var oldmessages MessagesStored
	// Every destination needs &; the original passed oldmessages.StreamSub by value.
	errrws := rows.Scan(&oldmessages.MessagetoStream, &oldmessages.StreamSub, &oldmessages.PubName, &oldmessages.CreatedAt)
	CheckError(errrws)
	OldMessages = append(OldMessages, oldmessages)
}
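One extra check worth making after the loop (my addition, reusing the CheckError helper from the post): rows.Next can stop early on a connection error, and that error only surfaces through rows.Err, so it's worth looking at it once the loop ends.
// After the loop: report any error that ended iteration prematurely.
CheckError(rows.Err())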
I created the following upsert function (insert, or update if the row already exists) in PostgreSQL using pgAdmin 4.
CREATE OR REPLACE FUNCTION upsert(
uname character varying(55),
fname character varying(55),
eml character varying(255),
psw character varying(265),
phonenbr character varying(55),
adrs character varying(300)
)
RETURNS table (j json) AS
$$
BEGIN
INSERT INTO users
VALUES (DEFAULT,uname, fname, eml, psw, phonenbr, adrs)
ON CONFLICT (username, firstname)
DO
UPDATE SET username = EXCLUDED.username, firstname = EXCLUDED.firstname,
email = EXCLUDED.email, password = EXCLUDED.password, phonenumber = EXCLUDED.phonenumber,
address = EXCLUDED.address, registrationdate=current_timestamp, subscriptionend =current_timestamp+ INTERVAL '1 month',stat='active';
END
$$
LANGUAGE 'plpgsql';
The issue is that when it does an insert, the values for 3 columns (registrationdate, subscriptionend and stat) are NULL, but everything is fine when it does an update.
Is there a way to solve this issue without adding 3 more parameters to the function?
The script for creating the table:
CREATE TABLE users
(
    id_user integer GENERATED ALWAYS AS IDENTITY,
    username character varying(55) NOT NULL,
    firstname character varying(55) NOT NULL,
    email character varying(255) NOT NULL,
    password character varying(255) NOT NULL,
    phonenumber character varying(55) NOT NULL,
    address character varying(300) NOT NULL,
    subscriptionend timestamp without time zone,
    registrationdate timestamp without time zone,
    stat status,
    CONSTRAINT users_pkey PRIMARY KEY (username, firstname)
)
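The NULLs on insert happen because the INSERT never supplies registrationdate, subscriptionend, or stat, and none of them has a DEFAULT. One alternative sketch, assuming the table itself can be altered, is to give those columns defaults so a plain INSERT fills them in:
ALTER TABLE users
    ALTER COLUMN registrationdate SET DEFAULT current_timestamp,
    ALTER COLUMN subscriptionend SET DEFAULT (current_timestamp + INTERVAL '1 month'),
    ALTER COLUMN stat SET DEFAULT 'active';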
I found a solution by declaring and using variables inside the function:
CREATE OR REPLACE FUNCTION upsert(
uname character varying(55),
fname character varying(55),
eml character varying(255),
psw character varying(265),
phonenbr character varying(55),
adrs character varying(300)
)
RETURNS table (j json) AS
$$
DECLARE
regisdate timestamp without time zone;
subsend timestamp without time zone;
statu status;
BEGIN
regisdate := current_timestamp;
subsend := current_timestamp + INTERVAL '1 month';
statu := 'active';
INSERT INTO users
VALUES (DEFAULT,uname, fname, eml, psw, phonenbr, adrs,regisdate,subsend,statu)
ON CONFLICT (username, firstname)
DO
UPDATE SET username = EXCLUDED.username, firstname = EXCLUDED.firstname,
email = EXCLUDED.email, password = EXCLUDED.password, phonenumber = EXCLUDED.phonenumber,
address = EXCLUDED.address, registrationdate = EXCLUDED.registrationdate, subscriptionend=EXCLUDED.subscriptionend,stat = EXCLUDED.stat;
END
$$
LANGUAGE 'plpgsql';
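A quick way to exercise the function (the argument values here are made up for illustration):
SELECT * FROM upsert('jdoe', 'John', 'jdoe@example.com', 'secret', '555-0100', '1 Main St');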
I have created a file format (CSV) and then an external table to load CSV data from Azure Blob Storage.
The external table is showing all columns as NULL except the VALUE column.
File Format Code
COMPRESSION = 'NONE'
FIELD_DELIMITER = ','
RECORD_DELIMITER = '\n'
SKIP_HEADER = 0
FIELD_OPTIONALLY_ENCLOSED_BY = 'NONE'
EMPTY_FIELD_AS_NULL = FALSE
TRIM_SPACE = FALSE
ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
ESCAPE = 'NONE'
ESCAPE_UNENCLOSED_FIELD = '\134'
DATE_FORMAT = 'AUTO'
TIMESTAMP_FORMAT = 'AUTO'
NULL_IF = ('NULL');
EXTERNAL TABLE CODE
CREATE OR REPLACE EXTERNAL TABLE EXT_DIM_TESTTABLE
(
COL1 VARCHAR (1000) AS (value:"COL1"::string),
COL2 VARCHAR (1000) AS (value:"COL2"::string),
COL3 VARCHAR (1000) AS (value:"COL3"::string),
COL4 VARCHAR (1000) AS (value:"COL4"::string),
COL5 VARCHAR (1000) AS (value:"COL5"::string),
COL6 VARCHAR (1000) AS (value:"COL6"::string)
)
WITH
LOCATION=@TESTSTAGE
AUTO_REFRESH = true
FILE_FORMAT = 'FILE_TESTFORMAT_CSV'
PATTERN='.*TEST_DATA.csv';
Now when I select * from EXT_DIM_TESTTABLE, all columns show NULL except the VALUE one.
The VALUE column comes back as below; the column names are not taken as "COL1" / "COL2" etc., but the values are correct. All the other columns are NULL:
{
"c1": "TESTING",
"c2": "TESTING",
"c3": "TESTING",
"c4": "TESTING",
"c5": "TESTING",
"c6": "TESTING"
}
Not sure what is missing here?
It seems you are using value:"COL1"::string incorrectly. For CSV files, Snowflake exposes each row in the VALUE variant under the positional, case-sensitive keys c1, c2, ... (exactly what your sample output shows), so value:"COL1" matches no key and returns NULL.
Can you try using the below DDL for the external table?
CREATE OR REPLACE EXTERNAL TABLE EXT_DIM_TESTTABLE
(
COL1 VARCHAR(1000) AS (value:c1::string),
COL2 VARCHAR(1000) AS (value:c2::string),
COL3 VARCHAR(1000) AS (value:c3::string),
COL4 VARCHAR(1000) AS (value:c4::string),
COL5 VARCHAR(1000) AS (value:c5::string),
COL6 VARCHAR(1000) AS (value:c6::string)
)
WITH
LOCATION=@TESTSTAGE
AUTO_REFRESH = true
FILE_FORMAT = 'FILE_TESTFORMAT_CSV'
PATTERN='.*TEST_DATA.csv'
;
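If the mapping is right, a quick sanity check (using the names from the post) should now show the parsed columns populated instead of NULL:
SELECT COL1, COL2, COL3, COL4, COL5, COL6 FROM EXT_DIM_TESTTABLE LIMIT 10;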
I tried to create an external table from a CSV file (which is in cloud storage), and I'm getting errors converting VARCHAR to DATETIME and VARCHAR to DECIMAL. Can someone please help me with what's going wrong?
Error messages
HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: Error converting data type VARCHAR to DECIMAL.
CREATE EXTERNAL TABLE [ext].[Load_History_test]
( [Table_Name] [varchar](100) NULL,
[Loaded_On] [datetime] NULL,
[Transferred_Count] [decimal](30) NULL,
[Transferred_Volume_MB] [decimal](30) NULL,
[Load Duration] [time] NULL,
[Throughput_Records] [decimal](30) NULL,
[Throughput_Volume_KB_sec] [decimal](30) NULL
)
WITH (DATA_SOURCE = [ADLS_External_Landing],
      LOCATION = N'/refdata/replicate/load_history',
      FILE_FORMAT = [CSVFileFormatwithHeaderloadhistory],
      REJECT_TYPE = VALUE,
      REJECT_VALUE = 0)
GO
CREATE EXTERNAL FILE FORMAT [CSVFileFormatwithHeaderloadhistory] WITH (FORMAT_TYPE = DELIMITEDTEXT, FORMAT_OPTIONS (FIELD_TERMINATOR = N',',
STRING_DELIMITER = N'0x22',
DATE_FORMAT = N'dd/MM/yyyy HH:mm:ss',
FIRST_ROW = 2,
USE_TYPE_DEFAULT = FALSE))
GO
There is at least one date not in d/m/y format, one decimal too large for decimal(30), or one or more rows with too many or too few commas, causing invalid formats.
Note that the row(s) causing the problem might not conveniently be among the three rows captured in your screenshot.
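One way to track the culprits down (a sketch of mine, not from the original answer; the _raw table name is invented): point a second external table at the same files with every column as VARCHAR so nothing fails on conversion, then filter with TRY_CONVERT. Rows with the wrong number of commas will still be rejected rather than shown, but the date and decimal offenders become visible.
CREATE EXTERNAL TABLE [ext].[Load_History_raw]
( [Table_Name] [varchar](100) NULL,
  [Loaded_On] [varchar](100) NULL,
  [Transferred_Count] [varchar](100) NULL,
  [Transferred_Volume_MB] [varchar](100) NULL,
  [Load Duration] [varchar](100) NULL,
  [Throughput_Records] [varchar](100) NULL,
  [Throughput_Volume_KB_sec] [varchar](100) NULL
)
WITH (DATA_SOURCE = [ADLS_External_Landing],
      LOCATION = N'/refdata/replicate/load_history',
      FILE_FORMAT = [CSVFileFormatwithHeaderloadhistory],
      REJECT_TYPE = VALUE, REJECT_VALUE = 0)
GO
-- Style 103 matches the dd/MM/yyyy layout declared in the file format.
SELECT *
FROM [ext].[Load_History_raw]
WHERE ([Loaded_On] IS NOT NULL AND TRY_CONVERT(datetime, [Loaded_On], 103) IS NULL)
   OR ([Transferred_Count] IS NOT NULL AND TRY_CONVERT(decimal(30), [Transferred_Count]) IS NULL)
GO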
I am using Go in the back-end and PostgreSQL as the database. I'm new to PostgreSQL database connections with Go. I'm using Beego as the back-end framework. I want to create a table where one of the fields is of JSON type, using the Go database/sql package and lib/pq. This is what I do.
This is my create table query:
createtable:= `CREATE TABLE apply_leave1 (
leaveid serial PRIMARY KEY NOT NULL ,
empid varchar(10) NOT NULL ,
leavedays double precision NOT NULL DEFAULT 0 ,
mdays double precision NOT NULL DEFAULT 0 ,
leavetype varchar(20) NOT NULL DEFAULT '' ,
daytype text NOT NULL '',
leavefrom timestamp with time zone NOT NULL,
leaveto timestamp with time zone NOT NULL,
applieddate timestamp with time zone NOT NULL,
leavestatus varchar(15) NOT NULL DEFAULT '' ,
resultdate timestamp with time zone,
certificatestatus bool NOT NULL DEFAULT FALSE
certificate json NULL)`
conn := fmt.Sprintf(
"user=%s password=%s dbname=%s sslmode=disable",
"postgres",
"root",
"employee")
log.Println("Creating a new connection: %v", conn)
db, err := sql.Open("postgres", conn)
stmt, err1 := db.Prepare(createtable)
defer stmt.Close()
_, err = stmt.Exec()
if err != nil {
fmt.Println(err.Error())
}
}
This throws the following error:
Handler crashed with error runtime error: invalid memory address or nil pointer dereference
But when I use a query to select something from the table, there is no problem with the table creation query or other executed SQL code. I appreciate any help. Thanks!
I think your problem is the invalid SQL used to create the table; it should be:
CREATE TABLE apply_leave1 (
leaveid serial PRIMARY KEY NOT NULL ,
empid varchar(10) NOT NULL ,
leavedays double precision NOT NULL DEFAULT 0 ,
mdays double precision NOT NULL DEFAULT 0 ,
leavetype varchar(20) NOT NULL DEFAULT '' ,
daytype text NOT NULL DEFAULT '',
leavefrom timestamp with time zone NOT NULL,
leaveto timestamp with time zone NOT NULL,
applieddate timestamp with time zone NOT NULL,
leavestatus varchar(15) NOT NULL DEFAULT '' ,
resultdate timestamp with time zone,
certificatestatus bool NOT NULL DEFAULT FALSE,
certificate json NULL)
You're missing the DEFAULT for the daytype column, and a comma after the certificatestatus line. Also, you're not handling the errors correctly: Prepare fails on the invalid SQL, err1 is never checked, so stmt is nil and the deferred stmt.Close() dereferences a nil pointer; that is the crash you see. Return as soon as you find an error, so the func leave() doesn't keep executing before sending a response.
db, err := sql.Open("postgres", conn)
if err != nil {
fmt.Println(err.Error())
e.Data["json"] = err.Error()
e.ServeJSON()
return
}
Also, you're creating a connection and not closing it at the end of the func.
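Putting those points together, here is a minimal sketch of the safer flow (the stand-in DDL and the log-based handling are mine; substitute the corrected apply_leave1 statement from above):
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // Postgres driver, as in the question
)

func main() {
	// Stand-in DDL for the sketch; use the corrected apply_leave1 statement above.
	createtable := `CREATE TABLE IF NOT EXISTS apply_leave1_demo (leaveid serial PRIMARY KEY)`

	conn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable",
		"postgres", "root", "employee")

	db, err := sql.Open("postgres", conn)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close() // close the connection when the func exits

	// Checking this error matters: a failed Prepare returns a nil stmt,
	// and calling Close or Exec on nil is the panic from the question.
	stmt, err := db.Prepare(createtable)
	if err != nil {
		log.Fatal(err)
	}
	defer stmt.Close()

	if _, err := stmt.Exec(); err != nil {
		log.Fatal(err)
	}
}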
I have a table "users" like this:
id | firstname | lastname
==========================
1 | Ujang | ahmad
2 | Jajat | sudrajat
and have this data:
$record = array('firstname'=>'some value', 'lastname'=>'some value');
$table = "users";
and process the update like this:
$exc= $conn->AutoExecute($table, $record, 'UPDATE', 'id = 1');
How do I update the field firstname with the value of lastname using AutoExecute,
so I get a result like this:
id | firstname | lastname
==========================
1 | ahmad | Ujang
2 | sudrajat | Jajat
Unless I misunderstand you, AutoExecute doesn't seem right for the job: as far as I know it quotes the values you pass as string literals, so one column can't reference another through it. If you need to do a one-time conversion of all records in your table I would just rename the columns.
ALTER TABLE `users`
CHANGE COLUMN `lastname` `firstname` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL AFTER `id`,
CHANGE COLUMN `firstname` `lastname` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL AFTER `firstname`;
Or in PHP/ADODB:
$sql = "ALTER TABLE `users`
CHANGE COLUMN `lastname` `firstname` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL AFTER `id`,
CHANGE COLUMN `firstname` `lastname` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL AFTER `firstname`;";
if (($result = $conn->Execute($sql)) === false)
exit($sql.'<br />'.$conn->errorMsg());
If you need to target specific records you could use a MySQL user variable.
$sql = "UPDATE users
SET firstname=(#temp:=firstname), firstname = lastname, lastname = #temp
WHERE id=2";
if (($result = $conn->Execute($sql)) === false)
exit($sql.'<br />'.$conn->errorMsg());
Cheers