Oracle DB Errors when Trying to Import a Dump File

Hi, I am trying to import a dump file but I'm getting the same errors every time. Here are the commands I have used so far:
C:\Users\CCT>sqlplus / as sysdba
SQL> create tablespace CCTADMIN datafile 'D:\OracleDB\CCTADMIN.dbf' size 2G autoextend on maxsize 5G;
SQL> create user wrosa identified by wrosa1;
SQL> grant connect, resource, dba to wrosa;
SQL> grant create materialized view to wrosa;
This next line didn't actually create my directory, so I went and created the directory manually on Windows.
SQL> create directory CCT_IMPORT as 'D:\OracleDB \TEMP';
SQL> grant read, write on directory CCT_IMPORT to wrosa;
D:\OracleDB \TEMP>impdp wrosa/wrosa1 directory=CCT_IMPORT dumpfile=CCTADMIN4.dmp logfile=impdpWROSA.log remap_schema=CCTADMIN:WROSA remap_tablespace=SOE:CCTADMIN
After that I get the following error:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
I appreciate your time looking at this.
Thanks

Kindly check whether the CCT_IMPORT directory object exists, whether the wrosa user has read/write privileges on it, and whether the path it points to is set correctly. Note that CREATE DIRECTORY only registers a pointer in the database; it does not create the folder on the OS, and the path is taken literally (your statement contains a space in 'D:\OracleDB \TEMP'). Without a valid directory and the read/write privileges on it, Data Pump cannot open its log or dump files.
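You can verify all three from a SYSDBA session; for example, a quick check against the standard dictionary views:
SQL> select directory_name, directory_path from dba_directories where directory_name = 'CCT_IMPORT';
SQL> select grantee, privilege from dba_tab_privs where table_name = 'CCT_IMPORT';
The path returned must exist on the database server exactly as spelled, and the Oracle OS user must be able to write to it.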

Related

Snowflake: what does COPY GRANTS in this statement do?

I have a SQL statement as below:
create or replace view
SOME_DB_NAME.SOME_SCHEMA_NAME.SOME_VIEW_NAME copy grants as select * from
SOME_DB_NAME.SOME_SCHEMA_NAME.SOME_VIEW_NAME;
And currently I don't have the view SOME_VIEW_NAME in the schema SOME_SCHEMA_NAME.
The above command fails with:
SQL compilation error: Object 'SOME_DB_NAME.SOME_SCHEMA_NAME.SOME_VIEW_NAME' does not exist or not authorized.
I am not sure what COPY GRANTS AS SELECT * FROM SOME_DB_NAME.SOME_SCHEMA_NAME.SOME_VIEW_NAME does.
Also, why is it failing with 'SOME_DB_NAME.SOME_SCHEMA_NAME.SOME_VIEW_NAME' does not exist when CREATE OR REPLACE is what we are using?
The error you received is due to a lack of the required privileges on the referenced object.
However, even with the privileges you could not create it, because the view definition refers to the view being defined: the statement selects from the very view it is trying to create.
If you have the required privileges, you will instead receive the error "View definition refers to view being defined".
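As for what COPY GRANTS itself does: when CREATE OR REPLACE replaces an existing view, COPY GRANTS carries the access grants of the old view over to the new one instead of dropping them. A minimal sketch with hypothetical names (note the new definition selects from a literal, not from the view itself):
create or replace view mydb.public.v1 as select 1 as c1;
grant select on view mydb.public.v1 to role analyst;
// without COPY GRANTS, this replacement would drop ANALYST's SELECT grant
create or replace view mydb.public.v1 copy grants as select 2 as c1;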

Docker / Oracle Database / Volume Persistence / Create Tablespace

I am building a Dev Docker environment and I have to set up an Oracle 19c database.
I have been successful... but not 100%.
Everything runs correctly: I can create a tablespace, a user/schema, and a table, insert data, and access the data via Node.js, until I restart the container.
All the tutorials show mounting a volume pointing to /opt/oracle/oradata:
volumes:
- ./database/OracleDB/oradata:/opt/oracle/oradata
But the tablespaces are created by default in /opt/oracle/product/19c/dbhome_1/dbs.
I tried to add a volume pointing to that directory:
volumes:
- ./database/OracleDB/oradata:/opt/oracle/oradata
- ./database/OracleDB/dbs:/opt/oracle/product/19c/dbhome_1/dbs/
But I receive the following error:
Error response from daemon: path /home/myusr/docker-base/database/OracleDB/dbs is mounted on / but it is not a shared mount.
Has anybody already faced this issue and found a solution?
I am of course continuing to search for one. ;)
System Information
Windows 10 Professional with WSL2
Docker version 20.10.8, build 3967b7d
Oracle Database 19c
UPDATE 1
Based on Roberto's comments: unfortunately, it is not working.
UPDATE 2
I tried the following
CREATE TABLESPACE tbs1_test DATAFILE '/opt/oracle/oradata/tbs1_test' SIZE 100M AUTOEXTEND ON NEXT 100M MAXSIZE 10G;
and it created the file in the desired location.
When you don't change the value of db_create_file_dest, Oracle uses its current value as the default destination for datafiles. In your case, when you executed your CREATE TABLESPACE command, the datafile was created in that default location, which is why it did not appear in the directory you wanted.
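You can check what it is currently set to before changing anything, for example:
SQL> show parameter db_create_file_dest
Then, to place the datafiles on the mounted volume: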
1. Connect as sysdba to the database.
2. Execute:
SQL> alter system set db_create_file_dest = '/opt/oracle/oradata/ORCLCDB' scope=both;
3. As you already have a volume on the directory above, remove the other volume specification, as it is already shared under /.
4. Drop the tablespace and create it back again (if it is empty):
SQL> DROP TABLESPACE tbs1_test INCLUDING CONTENTS AND DATAFILES;
SQL> CREATE TABLESPACE tbs1_test DATAFILE 'tbs1_test' SIZE 100M AUTOEXTEND ON NEXT 100M MAXSIZE 10G;
5. Verify that the datafile is now in the right volume:
SQL> select file_id, file_name from dba_data_files where tablespace_name = 'TBS1_TEST';
If you want to dig deeper into how to create specific volumes inside a Docker image, check this post on Stack Overflow; it is one of the best IMHO:
How to mount host volumes into docker containers in Dockerfile during build

In the tutorial "Tutorial: Bulk Loading from a local file system using copy" what is the difference between my_stage and my_table permissions?

I started to go through the first tutorial for how to load data into Snowflake from a local file.
This is what I have set up so far:
CREATE WAREHOUSE mywh;
CREATE DATABASE Mydb;
Use Database mydb;
CREATE ROLE ANALYST;
grant usage on database mydb to role sysadmin;
grant usage on database mydb to role analyst;
grant usage, create file format, create stage, create table on schema mydb.public to role analyst;
grant operate, usage on warehouse mywh to role analyst;
//tutorial 1 loading data
CREATE FILE FORMAT mycsvformat
TYPE = "CSV"
FIELD_DELIMITER= ','
SKIP_HEADER = 1;
CREATE FILE FORMAT myjsonformat
TYPE="JSON"
STRIP_OUTER_ARRAY = true;
//create stage
CREATE OR REPLACE STAGE my_stage
FILE_FORMAT = mycsvformat;
//Use snowsql for this and make sure that the role, db, and warehouse are selected: put file:///data/data.csv @my_stage;
// put file on stage
PUT file://contacts.csv @my
list @~;
list @%mytable;
Then in my active Snowsql when I run:
Put file:///Users/<user>/Documents/data/data.csv @my_table;
I have confirmed I am in the correct role ACCOUNTADMIN, but I still get:
002003 (02000): SQL compilation error:
Stage 'MYDB.PUBLIC.MY_TABLE' does not exist or not authorized.
So then I try to create the table in Snowsql and am successful:
create or replace table my_table(id varchar, link varchar, stuff string);
I still run into this error after I run:
Put file:///Users/<>/Documents/data/data.csv @my_table;
002003 (02000): SQL compilation error:
Stage 'MYDB.PUBLIC.MY_TABLE' does not exist or not authorized.
What is the difference between putting a file to my_table and to my_stage in this scenario? Thanks for your help!
EDIT:
CREATE OR REPLACE TABLE myjsontable(json variant);
COPY INTO myjsontable
FROM @my_stage/random.json.gz
FILE_FORMAT = (TYPE= 'JSON')
ON_ERROR = 'skip_file';
CREATE OR REPLACE TABLE save_copy_errors AS SELECT * FROM TABLE(VALIDATE(myjsontable, JOB_ID=>'enterid'));
SELECT * FROM SAVE_COPY_ERRORS;
//error for random: Error parsing JSON: invalid character outside of a string: '\\'
//no error for generated
SELECT * FROM Myjsontable;
REMOVE @My_stage pattern = '.*.csv.gz';
REMOVE @My_stage pattern = '.*.json.gz';
//yay, you are done!
The PUT command copies the file from your local drive to the stage. You should do the PUT to the stage, not to the table.
put file:///Users/<>/Documents/data/data.csv @my_stage;
The copy command loads it from the stage.
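Putting the two commands together, the flow looks something like this (using the question's own placeholder paths; PUT auto-compresses by default, so the staged file becomes data.csv.gz):
put file:///Users/<>/Documents/data/data.csv @my_stage;
// my_stage was created with FILE_FORMAT = mycsvformat, so COPY can rely on it
copy into my_table from @my_stage/data.csv.gz;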
But in the documentation it is mentioned that one gets created by default for every table:
Each table has a Snowflake stage allocated to it by default for storing files. This stage is a convenient option if your files need to be accessible to multiple users and only need to be copied into a single table.
Table stages have the following characteristics and limitations:
Table stages have the same name as the table; e.g. a table named mytable has a stage referenced as @%mytable
So in this case, without creating a stage, it should load into the default Snowflake stage allocated to the table.
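That is true, but a table stage is referenced with the % prefix; without it, my_table resolves as a named stage, which is why PUT to @my_table fails with "Stage 'MYDB.PUBLIC.MY_TABLE' does not exist or not authorized". A minimal sketch of loading through the table stage instead:
put file:///Users/<>/Documents/data/data.csv @%my_table;
copy into my_table from @%my_table file_format = (format_name = 'mycsvformat');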

Creating database error

I want to create a database from my local computer through a stored procedure, but it is not allowing me to create the database.
I'm getting this error:
Msg 5133, Level 16, State 1, Line 1
Directory lookup for the file "D:\SHA_dat\SHA.dat" failed with the operating system error 2(The system cannot find the file specified.).
Msg 1802, Level 16, State 1, Line 1
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
And my code is here:
SELECT @cSQL = 'CREATE DATABASE '+@cDBName+' ON ( NAME = '''+@cDBName+'_data'''+',
FILENAME = ' + quotename(@cDbPath)+ ') LOG ON ( NAME = '''+@cDBName+'_Log'',
FILENAME = ' + quotename(@cLogPath)+ ')'
select @cSQL
where @cDBName refers to the database name to create, and @cDbPath and @cLogPath refer to network paths where the .dat and .log files should be created.
Can anyone help me?
You write that @cDbPath and @cLogPath are network paths. This won't work. While the network path is available to you as you submit the CREATE DATABASE, it is the SQL Server service that processes the CREATE DATABASE. The service runs in its own logon session with its own user account, and it does not have the same drives mapped as you have. So you cannot use a network drive the way you are doing it.
BTW, the same applies when you want to restore from a backup. The backup needs to be available on a local drive so that the SQL Server service can access it.
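A minimal sketch with a local server-side path instead (D:\SQLData here is a hypothetical folder the SQL Server service account can write to). Note QUOTENAME(..., '''') so the path ends up in single quotes; the default QUOTENAME would wrap it in brackets, which the FILENAME clause does not accept, and QUOTENAME input is limited to 128 characters:
DECLARE @cDBName sysname = N'SHA';
DECLARE @cDbPath nvarchar(128) = N'D:\SQLData\SHA.mdf'; -- local path on the server
DECLARE @cLogPath nvarchar(128) = N'D:\SQLData\SHA_log.ldf';
DECLARE @cSQL nvarchar(max);

SELECT @cSQL = N'CREATE DATABASE ' + QUOTENAME(@cDBName)
    + N' ON ( NAME = ''' + @cDBName + N'_data'', FILENAME = ' + QUOTENAME(@cDbPath, '''') + N' )'
    + N' LOG ON ( NAME = ''' + @cDBName + N'_Log'', FILENAME = ' + QUOTENAME(@cLogPath, '''') + N' )';

EXEC sys.sp_executesql @cSQL;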

Oracle 11g External Table error

I'm trying to run a simple external table program using Oracle 11g on a Linux VM. The problem is that I can't query any data from the .txt file.
Here's my code:
CONN / as sysdba;
CREATE OR REPLACE DIRECTORY DIR1 AS 'home/oracle/TEMP/X/';
GRANT READ, WRITE ON DIRECTORY DIR1 TO user;
CONN user/password;
CREATE TABLE gerada
(
field1 INT,
field2 Varchar2(20)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY DIR1
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ';'
MISSING FIELD VALUES ARE NULL
)
LOCATION ('registros.txt')
)
REJECT LIMIT UNLIMITED;
--Error starts here.
SELECT * FROM gerada;
DROP TABLE gerada;
DROP DIRECTORY DIR1;
Here's the error message:
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file home/oracle/TEMP/X/GERADA_3375.log
And this is what registros.txt looks like:
1234;hello world;
I've checked my permissions on DIR1 and I do have read/write permissions.
Any ideas?
ORA-29913 and ORA-29400 mean that you're unable to access the directory and/or file.
Looking carefully at the CREATE DIRECTORY command, it looks like the path you're using is malformed. Try putting a forward slash at the start of the path and removing the one at the end when creating the directory, e.g. CREATE OR REPLACE DIRECTORY DIR1 AS '/home/oracle/TEMP/X';.
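You can confirm what path the database actually registered, for example:
SQL> SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DIR1';
Also note from the error text that ORACLE_LOADER writes its log file (GERADA_3375.log) into the same directory, so the oracle OS user needs write permission on that directory in addition to read permission on registros.txt.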
Share and enjoy.
