Export postgres table to csv error

I am trying to export all of my Postgres tables into individual CSV files. For that, I am using the following function:
CREATE OR REPLACE FUNCTION db_to_csv(path text)
  RETURNS void AS
$BODY$
DECLARE
  tables RECORD;
  statement TEXT;
BEGIN
  FOR tables IN
    SELECT (table_schema || '.' || table_name) AS schema_table
    FROM information_schema.tables t
    INNER JOIN information_schema.schemata s
      ON s.schema_name = t.table_schema
    WHERE t.table_schema NOT IN ('pg_catalog', 'information_schema', 'configuration')
    ORDER BY schema_table
  LOOP
    statement := 'COPY ' || tables.schema_table || ' TO ''' || path || '/' || tables.schema_table || '.csv' || ''' DELIMITER '';'' CSV HEADER';
    EXECUTE statement;
  END LOOP;
  RETURN;
END;
$BODY$
LANGUAGE plpgsql VOLATILE
COST 100;
ALTER FUNCTION db_to_csv(text)
  OWNER TO postgres;
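To run the export, the function is called with the target directory; judging by the error message below, the call was pointed at /home/user/Documents:
SELECT db_to_csv('/home/user/Documents');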
But when I call this function I get: could not open file "/home/user/Documents/public.tablename.csv" for writing: Permission denied
I have also tried copying an individual table using
COPY activities TO '/home/user/Documents/foldername/conversions/tablename.csv' DELIMITER ',' CSV HEADER;
It gives me the following error
ERROR: could not open file "/home/user/Documents/foldername/conversions/tablename.csv" for writing: Permission denied
********** Error **********
ERROR: could not open file "/home/user/Documents/foldername/conversions/tablename.csv" for writing: Permission denied
SQL state: 42501
Any suggestions on how to fix this?

Make a folder that every user can access, then run the COPY command against a file in that folder. Server-side COPY only works in directories the postgres OS user can access.
sudo mkdir /media/export
sudo chmod 777 /media/export
COPY activities TO '/media/export/activities.csv' DELIMITER ',' CSV HEADER;

I was facing the same issue and followed the second answer:
Make a folder that every user can access, then run the COPY command against a file in that folder. Server-side COPY only works in directories the postgres OS user can access.
This didn't work for me, so I copied to /tmp/somename.csv instead and then moved the file from there to the location where I actually needed it:
\copy query TO '/tmp/somename.csv' with csv;
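For example, for the activities table from the original question: \copy runs on the client side with your own OS permissions, so any directory you can write to works (the /tmp path is only an illustration):
\copy (SELECT * FROM activities) TO '/tmp/activities.csv' WITH CSV HEADER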

This was still not working for me after granting permissions.
Then I tried exporting to the location where the Greenplum data lives, i.e. greenplum/data, and the permission denied problem was resolved:
COPY Table_Name TO '/home/greenplum/data/table_data_export.csv';

Related

Copy the same file into table using COPY command & snowpipe

I couldn't load the same file into a table in Snowflake using the COPY command / Snowpipe.
I always get the following result:
Copy executed with 0 files processed.
I have re-created the table and truncated it, but copy_history doesn't show any data:
select * from table(information_schema.copy_history(table_name=>'mytable', start_time=> dateadd(hours, -10, current_timestamp())));
I have used FORCE = true in the COPY command, but it still didn't load the same file into the table. I have explicitly mentioned the file path in the COPY command:
COPY INTO mytable
FROM @STAGE_DEV/myfile/05/28/16/myfile_1.csv
file_format = (
    format_name = STANDARD_CSV_FORMAT skip_header = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"' NULL_IF = 'NULL'
)
on_error = continue
force = true;
Has anyone faced a similar issue? What would be the process to load the same file again using the COPY command or Snowpipe? I don't have the option to change the file name or to put the files in a different S3 bucket.
ls @STAGE_DEV shows the file present in the stage (the listing screenshot is not reproduced here).
I have reloaded the files to the S3 bucket and it's working now. Thank you all for the responses.

Set file name for unloaded file from Snowflake

I am unloading Snowflake data into an external AWS S3 stage using the command below:
copy into '@ext_stg/path/file_name'
from schema.table
file_format = (type=csv field_delimiter= '~' compression='gzip' null_if=('','NULL', 'null',' ') field_optionally_enclosed_by= '"' )
OVERWRITE = TRUE
;
I want the unloaded filename to be file_name.csv.gz.
But what I actually get from the above command is file_name_0_3_0.csv.gz.
How do I set the desired filename as file_name.csv.gz?
Setting SINGLE=TRUE MAX_FILE_SIZE=5000000000 gave me the desired output. Thank you @waldente.
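For reference, applied to the command from the question it would look roughly like this; with SINGLE = TRUE the file keeps exactly the name given, so the extension is written into the path (a sketch based on the comment above, not a verified command):
copy into '@ext_stg/path/file_name.csv.gz'
from schema.table
file_format = (type=csv field_delimiter='~' compression='gzip' null_if=('','NULL','null',' ') field_optionally_enclosed_by='"')
single = true
max_file_size = 5000000000
overwrite = true;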
Looks like you are trying to specify the csv extension twice. Try removing it from the S3 path, because the file extension is already specified in file_format.
copy into '@ext_stg/path/file_name.csv'
from schema.table
file_format = (type=csv field_delimiter= '~' compression='gzip' null_if=('','NULL', 'null',' ') field_optionally_enclosed_by= '"' )
OVERWRITE = TRUE
;

In the tutorial "Tutorial: Bulk Loading from a local file system using copy" what is the difference between my_stage and my_table permissions?

I started to go through the first tutorial for how to load data into Snowflake from a local file.
This is what I have set up so far:
CREATE WAREHOUSE mywh;
CREATE DATABASE Mydb;
Use Database mydb;
CREATE ROLE ANALYST;
grant usage on database mydb to role sysadmin;
grant usage on database mydb to role analyst;
grant usage, create file format, create stage, create table on schema mydb.public to role analyst;
grant operate, usage on warehouse mywh to role analyst;
//tutorial 1 loading data
CREATE FILE FORMAT mycsvformat
TYPE = "CSV"
FIELD_DELIMITER= ','
SKIP_HEADER = 1;
CREATE FILE FORMAT myjsonformat
TYPE="JSON"
STRIP_OUTER_ARRAY = true;
//create stage
CREATE OR REPLACE STAGE my_stage
FILE_FORMAT = mycsvformat;
//Use snowsql for this and make sure that the role, db, and warehouse are selected: put file:///data/data.csv @my_stage;
// put file on stage
PUT file://contacts.csv @my
List @~;
list @%mytable;
Then in my active Snowsql when I run:
Put file:///Users/<user>/Documents/data/data.csv @my_table;
I have confirmed I am in the correct role Accountadmin:
002003 (02000): SQL compilation error:
Stage 'MYDB.PUBLIC.MY_TABLE' does not exist or not authorized.
So then I try to create the table in Snowsql and am successful:
create or replace table my_table(id varchar, link varchar, stuff string);
I still run into this error after I run:
Put file:///Users/<>/Documents/data/data.csv @my_table;
002003 (02000): SQL compilation error:
Stage 'MYDB.PUBLIC.MY_TABLE' does not exist or not authorized.
What is the difference between putting a file to my_table and to my_stage in this scenario? Thanks for your help!
EDIT:
CREATE OR REPLACE TABLE myjsontable(json variant);
COPY INTO myjsontable
FROM @my_stage/random.json.gz
FILE_FORMAT = (TYPE= 'JSON')
ON_ERROR = 'skip_file';
CREATE OR REPLACE TABLE save_copy_errors AS SELECT * FROM TABLE(VALIDATE(myjsontable, JOB_ID=>'enterid'));
SELECT * FROM SAVE_COPY_ERRORS;
//error for random: Error parsing JSON: invalid character outside of a string: '\\'
//no error for generated
SELECT * FROM Myjsontable;
REMOVE @My_stage pattern = '.*.csv.gz';
REMOVE @My_stage pattern = '.*.json.gz';
//yay, you're done!
The PUT command copies the file from your local drive to the stage. You should PUT to the stage, not to the table.
put file:///Users/<>/Documents/data/data.csv @my_stage;
The copy command loads it from the stage.
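Once the file has been PUT to the stage, the load itself would look roughly like this, using the stage and file format from the question's setup (a sketch, not taken from the tutorial verbatim):
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = mycsvformat)
  ON_ERROR = 'skip_file';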
But in the documentation it's mentioned that a stage gets created by default for every table:
Each table has a Snowflake stage allocated to it by default for storing files. This stage is a convenient option if your files need to be accessible to multiple users and only need to be copied into a single table.
Table stages have the following characteristics and limitations:
Table stages have the same name as the table; e.g. a table named mytable has a stage referenced as @%mytable
So in this case, without creating a stage, it should load into the default Snowflake stage allocated to the table.
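That part is right: every table does get an implicit table stage. The catch is the reference syntax: a table stage is addressed as @%my_table, whereas @my_table means a named stage called my_table, which is why the PUT above failed. A sketch of the two variants (local path copied from the question):
// PUT to the table's implicit stage (note the %); no CREATE STAGE needed
Put file:///Users/<user>/Documents/data/data.csv @%my_table;
// PUT to the named stage created in the tutorial
Put file:///Users/<user>/Documents/data/data.csv @my_stage;
// A table stage can then be loaded with: COPY INTO my_table FROM @%my_table;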

I want to export a schema using expdp with the following SQL-Statement -> ORA-39001

I want to export a database schema with the following SQL statement. For the moment I am using SQL Developer; in the future I want to run this statement from Java.
DECLARE
  schemaName VARCHAR2(200) := 'Example';
  dirName    VARCHAR2(200) := '/dir/exampleDir';
  dumpFile   VARCHAR2(200) := 'TestFile.dmp';
  directory  VARCHAR(100)  := 'EXPORT_DIR_' || schemaName;
  handle     NUMBER;
  status     VARCHAR2(20);
BEGIN
  EXECUTE IMMEDIATE 'CREATE OR REPLACE DIRECTORY ' || directory || ' AS ''' || dirName || '''';
  handle := DBMS_DATAPUMP.OPEN(
    operation => 'EXPORT',
    job_mode  => 'SCHEMA',
    job_name  => 'TEST_EXPORT_' || schemaName);
  DBMS_DATAPUMP.ADD_FILE(handle, dumpFile, directory);
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (' || schemaName || ')');
  DBMS_DATAPUMP.START_JOB(handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle, status);
  EXECUTE IMMEDIATE 'DROP DIRECTORY ' || directory;
END;
I get the error message "ORA-39001".
Can anyone help me, please? I don't know how I can solve this problem. I have read some information about this error message. Is the problem the directory, or something else? I would be very thankful if you can help me. I am sorry about my poor English.
Thank you very much.
Also, '/dir/exampleDir' - I don't think that's a valid Oracle DIRECTORY name. An Oracle directory is a DB object that references a directory on your file system.
That is only an example. I know that an Oracle directory is a DB object that references a directory on the file system. But the directory in my SQL statement and the directory on the file system are the same.
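One thing worth double-checking (an assumption on my side, not confirmed in this thread): ORA-39001 means "invalid argument value", and the SCHEMA_EXPR filter expects the schema name as a quoted string literal inside the IN list, so passing the bare identifier can make the call fail. A sketch of just that call with the quoting added (UPPER assumes the schema was created with a normal upper-case name):
  -- Hypothetical fix: wrap the schema name in quotes inside the IN list.
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''' || UPPER(schemaName) || ''')');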

Oracle 11g External Table error

I'm trying to run a simple external table program using oracle 11g on Linux VM. The problem is that I can't query any data from .txt files.
Here's my code:
CONN / as sysdba;
CREATE OR REPLACE DIRECTORY DIR1 AS 'home/oracle/TEMP/X/';
GRANT READ, WRITE ON DIRECTORY DIR1 TO user;
CONN user/password;
CREATE TABLE gerada
(
field1 INT,
field2 Varchar2(20)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY DIR1
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ';'
MISSING FIELD VALUES ARE NULL
)
LOCATION ('registros.txt')
)
REJECT LIMIT UNLIMITED;
--Error starts here.
SELECT * FROM gerada;
DROP TABLE gerada;
DROP DIRECTORY DIR1;
Here's the error message:
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file home/oracle/TEMP/X/GERADA_3375.log
And this is what registros.txt looks like:
1234;hello world;
I've checked my permissions on DIR1 and I do have read/write permissions.
Any ideas?
ORA-29913 and ORA-29400 mean that you're unable to access the directory and/or file.
Looking carefully at the CREATE DIRECTORY command it looks like the path you're using may be mis-formatted. Try putting a forward slash at the start of the path and removing the one at the end of the path when creating the directory - e.g. CREATE OR REPLACE DIRECTORY DIR1 AS '/home/oracle/TEMP/X';.
Share and enjoy.
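In other words, something along these lines; same objects as in the question, only the path is changed to an absolute one (a sketch, assuming the files really live under /home/oracle/TEMP/X):
CREATE OR REPLACE DIRECTORY DIR1 AS '/home/oracle/TEMP/X';
GRANT READ, WRITE ON DIRECTORY DIR1 TO user;
--then re-run the query against the external table
SELECT * FROM gerada;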
