Laravel 5.5 - DB::statement error with \copy command (POSTGRES)

I'm trying to use the \copy command from Postgres with Laravel 5.5 to insert a large file into the DB, but I'm getting the error below.
I tried this way:
DB::statement( DB::raw("\\copy requisicoes FROM '".$file1."' WITH DELIMITER ','"));
I get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
I tried this way too:
DB::statement( DB::raw('\copy requisicoes FROM \''.$file1.'\' WITH DELIMITER \',\''));
I get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
If I execute the command returned in the error above at the psql command line, it works fine:
\copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ','
Could somebody help me? :)
I have to use \copy instead of COPY because I don't have superuser privileges on the DB.
https://www.postgresql.org/docs/9.2/static/sql-copy.html
COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.

See this article on PostgreSQL and note this line:
Do not confuse COPY with the psql instruction \copy. \copy invokes
COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in
a file accessible to the psql client. Thus, file accessibility and
access rights depend on the client rather than the server when \copy
is used.
\copy is a psql instruction, not SQL, so you cannot send \copy through a database driver; from Laravel you have to issue plain COPY.
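Since plain COPY naming a file needs superuser rights, the usable middle ground is the STDIN form that \copy itself issues, with the client streaming the file; with PHP's pdo_pgsql driver, that client-side streaming is what PDO::pgsqlCopyFromFile() does. A minimal sketch of the statement psql actually sends for the command in the question (an illustration based on the quoted docs, not a drop-in Laravel fix):
-- what psql turns "\copy requisicoes FROM '/path/file' WITH DELIMITER ','" into;
-- psql then reads the file locally and streams its contents to the server:
COPY requisicoes FROM STDIN WITH (DELIMITER ',');
Because the file is opened by the client, the access check happens with the client's permissions, so no superuser privilege is needed on the server.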

This is my code to import data from the default SQL connection into a PostgreSQL database. It first exports a CSV file with '^' as the separator, then imports that same file into pgsql using the COPY command:
$users = User::select('*')->get()->toArray();

// build the CSV contents with '^' as the field separator
$pages = "id,warehouse_id,name,email,email_verified_at,password,remember_token,created_at,updated_at\n";
foreach ($users as $where) {
    $pages .= "{$where['id']}^{$where['warehouse_id']}^{$where['name']}^{$where['email']}^{$where['email_verified_at']}^{$where['password']}^{$where['remember_token']}^{$where['created_at']}^{$where['updated_at']}\n";
}

// write the CSV to storage/app/user.csv
$file = Storage::disk('local')->put('user.csv', $pages);

if ($file) {
    try {
        // server-side COPY: the file must be readable by the PostgreSQL server process
        $file_path = storage_path('app/user.csv');
        $data = DB::connection('pgsql')->statement("copy public.users (id, warehouse_id, name, email, email_verified_at, password, remember_token, created_at, updated_at) FROM '$file_path' DELIMITER '^' CSV HEADER ENCODING 'UTF8' ESCAPE '\"';");
    } catch (\Exception $e) {
        throw $e;
    }
}

Related

How to solve error "Field delimiter ',' found while expecting record delimiter '\n'" while loading JSON data to the stage

I am trying to use the COPY INTO command to load data from S3 into Snowflake.
Below are the steps I followed to create the stage and load the file from the stage into Snowflake.
JSON file
{
"Name":"Umesh",
"Desigantion":"Product Manager",
"Location":"United Kingdom"
}
create or replace stage emp_json_stage
url='s3://mybucket/emp.json'
credentials=(aws_key_id='my id' aws_secret_key='my key');
-- create the table with a VARIANT column
CREATE TABLE emp_json_raw (
json_data_raw VARIANT
);
-- load data from the stage into Snowflake
COPY INTO emp_json_raw FROM @emp_json_stage;
I am getting the error below:
Field delimiter ',' found while expecting record delimiter '\n' File
'emp.json', line 2, character 18 Row 2, column
"emp_json_raw"["JSON_DATA_RAW":1]
I am using a simple JSON file, and I don't understand this error.
What causes it and how can I solve it?
The file format is not specified, so it defaults to CSV; hence the error.
Try this:
COPY INTO emp_json_raw
FROM @emp_json_stage
file_format=(TYPE=JSON);
There are other options besides TYPE that can be specified with file_format. Refer to the documentation here: https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#type-json
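Once the load succeeds, the individual fields can be pulled back out of the VARIANT column with Snowflake's path syntax. A quick check, assuming the table and field names from the question:
-- extract the JSON fields from the VARIANT column
-- ("Desigantion" is spelled this way in the source file)
SELECT json_data_raw:Name::string        AS name,
       json_data_raw:Desigantion::string AS designation,
       json_data_raw:Location::string    AS location
FROM emp_json_raw;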
Try:
file_format = (type = csv field_optionally_enclosed_by='"')
The default settings do not expect the " wrapping around your data.
So you could strip all the " characters or just set field_optionally_enclosed_by to ". This does mean that if your data has " in it, things get messy.
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#type-csv
Also, make it standard practice to specify the type of the file, whether CSV, JSON, Avro, Parquet, etc.
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html
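A reusable way to do that is a named file format, which keeps the COPY statement itself clean. A sketch, with my_json_format as a hypothetical name:
-- define the format once, then reference it by name in every COPY
CREATE OR REPLACE FILE FORMAT my_json_format TYPE = JSON;
COPY INTO emp_json_raw
FROM @emp_json_stage
FILE_FORMAT = (FORMAT_NAME = 'my_json_format');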

Syntax error near GO in a script run from PowerShell

I'm trying to bulk import into SQL Server, and I need to automate the task (there are thousands of directories) in PowerShell. I'm using bcp and have a format file because I need to skip a column when importing. Whenever I run this, it fails with the error:
Exception calling "ExecuteReader" with "0" argument(s): "Incorrect syntax near 'GO'.
The code is:
$query =
"USE Database;
GO
BULK INSERT $tableName
FROM 'C:\users\Name\documents\bcp_sql\File\$name\$dir_id${string}.csv'
WITH (FORMATFILE = 'C:\users\Name\documents\bcp_sql\formatFile.fmt');
GO
SELECT * FROM $tableName;
GO"
$sqlCmd2 = $connection.CreateCommand()
$sqlCmd2.Connection = $connection
$sqlCmd2.CommandText = $query
$sqlCmd2.ExecuteReader()
I've confirmed that the file paths do, in fact, exist (by cd-ing to them).
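For what it's worth, GO is not T-SQL at all: it is a batch separator recognized by clients like sqlcmd and SSMS, while SqlClient's ExecuteReader hands the whole string to the server, which rejects GO as a syntax error. A sketch of the same script with the GO lines simply dropped (the semicolons already separate the statements; the PowerShell variables are kept as placeholders from the question):
-- same statements without GO; SqlClient sends this as a single T-SQL batch
USE Database;
BULK INSERT $tableName
FROM 'C:\users\Name\documents\bcp_sql\File\$name\$dir_id${string}.csv'
WITH (FORMATFILE = 'C:\users\Name\documents\bcp_sql\formatFile.fmt');
SELECT * FROM $tableName;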

Load CSV file into database table in WordPress

As said in my previous question, I am having trouble loading data into my table through a CSV file. The error I am getting is that the file is not found, but it's there. Could this be a WAMP issue, i.e. permissions? You can find the code below along with the error.
Appreciate your help as always:
function load_table() {
    global $wpdb;
    $filename = 'upper_db_.csv';
    $table_nme = $wpdb->prefix . "upper_winds";
    $sql = "LOAD DATA INFILE '" . $filename . "'
        INTO TABLE $table_nme
        FIELDS TERMINATED BY ','
        ENCLOSED BY '\"'
        ESCAPED BY '\"'
        LINES TERMINATED BY '\n'
        ";
    $wpdb->query($sql);
}
File 'c:\wamp\bin\mysql\mysql5.6.17\data\wp-test\upper_db_.csv' not found (Errcode: 2 - No such file or directory)
P.S. I have tried using backslashes and giving the full path, and I have also tried LOAD DATA LOCAL INFILE, but to no avail.
The problem has been solved using the below:
$filename = str_replace("//","//////",__DIR__ ."/file.csv");
The plugin has now been completed and can be viewed at http://howtoflyahelicopter.com/aviation-weather-briefing/
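For reference, the underlying cause is that MySQL resolves a bare file name against its own data directory, hence the c:\wamp\bin\mysql\...\data\ path in the error. Passing an absolute client-side path with LOCAL sidesteps that. A sketch, assuming a hypothetical plugin path and the default wp_ table prefix:
LOAD DATA LOCAL INFILE 'C:/wamp/www/wp-content/plugins/my-plugin/upper_db_.csv'
INTO TABLE wp_upper_winds
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n';
Note that LOAD DATA LOCAL must be allowed on both the client and the server (the local_infile setting), which may be why the LOCAL attempt mentioned in the P.S. failed.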

Export postgres table to csv error

I am trying to export all my Postgres tables into individual CSV files, and for that I am using the following function:
CREATE OR REPLACE FUNCTION db_to_csv(path text)
RETURNS void AS
$BODY$
DECLARE
    tables RECORD;
    statement TEXT;
BEGIN
    FOR tables IN
        SELECT (table_schema || '.' || table_name) AS schema_table
        FROM information_schema.tables t
        INNER JOIN information_schema.schemata s
            ON s.schema_name = t.table_schema
        WHERE t.table_schema NOT IN ('pg_catalog', 'information_schema', 'configuration')
        ORDER BY schema_table
    LOOP
        statement := 'COPY ' || tables.schema_table || ' TO ''' || path || '/' || tables.schema_table || '.csv' || ''' DELIMITER '';'' CSV HEADER';
        EXECUTE statement;
    END LOOP;
    RETURN;
END;
$BODY$
LANGUAGE plpgsql VOLATILE
COST 100;
ALTER FUNCTION db_to_csv(text)
OWNER TO postgres;
but when I call this function I get: could not open file "/home/user/Documents/public.tablename.csv" for writing: Permission denied
I have tried copying an individual table using:
COPY activities TO '/home/user/Documents/foldername/conversions/tablename.csv' DELIMITER ',' CSV HEADER;
It gives me the following error
ERROR: could not open file "/home/user/Documents/foldername/conversions/tablename.csv" for writing: Permission denied
********** Error **********
ERROR: could not open file "/home/user/Documents/foldername/conversions/tablename.csv" for writing: Permission denied
SQL state: 42501
Any suggestions on how to fix this?
Make a folder to which every user has access, then run the COPY command on a file there. COPY works only in directories the postgres user can access:
sudo mkdir /media/export
sudo chmod 777 /media/export
COPY activities TO '/media/export/activities.csv' DELIMITER ',' CSV HEADER;
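With the directory in place, the db_to_csv function from the question can be pointed at it. A usage sketch:
-- export every table to /media/export, one CSV per table
SELECT db_to_csv('/media/export');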
I was facing the same issue, and I followed the second answer:
Make a folder to which every user has access, then run the COPY command on a file there. COPY works only in directories the postgres user can access.
This didn't work for me.
So, I performed the copy to /tmp/somename.csv and then copied the file to the location where I actually needed it:
\copy query TO '/tmp/somename.csv' with csv;
It was not working even after the permissions were given.
Then I tried exporting to the same location where the Greenplum data lives, i.e. greenplum/data, and the permission denied problem was resolved:
COPY Table_Name TO '/home/greenplum/data/table_data_export.csv';

monetdb - error loading tbl

Loading the .tbl file, I got this error:
[nicola#localhost ~]$ mclient -d dbmonet -s "COPY INTO monet.SUPPLIER FROM STDIN USING DELIMITERS ',','\\n','\"'" - < /home/nicola/Scrivania/tabellemonetdb/supplier.tbl
user(nicola):monetdb
password:
missing separator ',' line 0 expecting 6 got 1 fields
failed to import table
current transaction is aborted (please ROLLBACK)
syntax error, unexpected sqlINT in: "0201"
Why do I get this error?
I'm using an SSB schema.
Without knowing anything about the structure of the supplier.tbl file, my guess (from having used SSBM before) would be that it does not use "," as a field separator, but "|".
My SSBM loading command for the supplier table looks like this:
COPY INTO SUPPLIER FROM '/path/to/supplier.tbl' USING DELIMITERS '|', '|\n' LOCKED;
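Applied to the STDIN invocation from the question, the same fix is just a separator swap. A sketch of the statement inside the same mclient -s call, assuming the question's table:
-- SSBM .tbl files are pipe-separated, with each record ending in '|\n'
COPY INTO monet.SUPPLIER FROM STDIN USING DELIMITERS '|', '|\n';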
