I created an external table, and when I select from it this error shows. I work with Oracle 19c.
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file customer.csv in EXTERNAL not found
------------code----------------
CREATE TABLE customers
(Email VARCHAR2(255) NOT NULL,
Name VARCHAR2(255) NOT NULL,
Phone VARCHAR2(255) NOT NULL,
Address VARCHAR2(255) NOT NULL)
ORGANIZATION EXTERNAL(
type oracle_loader
DEFAULT DIRECTORY external
ACCESS PARAMETERS
(
records delimited by newline
fields terminated by ','
missing field values are null
REJECT ROWS WITH ALL NULL FIELDS)
LOCATION ('customer.csv'))
REJECT LIMIT UNLIMITED;
customer.csv data:
salma.55@gmm.com,salma,0152275522,44al,
mariam.66@hotmail.com,mariam,011145528,552www,
ahmed.85@gmail.com,ahmed,0111552774,44eee,
"DEFAULT DIRECTORY external" means you are looking in a named directory that you have called "external".
For example, if I had done:
create directory XYZ as '/tmp';
then
default directory XYZ
means I'll be searching in /tmp for my files. So look at DBA_DIRECTORIES to see where your "EXTERNAL" directory is pointing.
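A quick lookup like the one below (a minimal sketch; it assumes you can read DBA_DIRECTORIES, otherwise use ALL_DIRECTORIES) shows the operating-system path behind that directory object:
-- Show where the EXTERNAL directory object points on the database server
SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'EXTERNAL';
customer.csv then has to exist in that path on the database server, with exactly that file name, and be readable by the OS user that runs the Oracle processes.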
I try to create a table with the following statement:
[omm@tpl-centos7 bin]$ ./gsql -ddev -Uomm -p26000 -f test.sql
gsql:test.sql:7: ERROR: column name "tid" conflicts with a system column name
total time: 0 ms
[omm@tpl-centos7 bin]$ cat test.sql
CREATE TABLE if not exists ax_quarantine_rcpt
(
"tid" varchar2(255) NOT NULL,
rcpt varchar2(255) NOT NULL,
org_id varchar2(60) NOT NULL,
PRIMARY KEY ("tid",rcpt)
);
The error: column name "tid" conflicts with a system column name.
Looking at the documentation, it says reserved words must never be used as other identifiers, but tid is not in the keyword list either.
Following advice found online, I also tried wrapping the name in double quotation marks, but that does not work. How can I solve this?
It may be a version problem: in older builds tid is taken by a system column used by system views. With the latest compiled version I can create this table, so it is recommended to use the latest version.
If you cannot compile it yourself, you can use the container version: https://hub.docker.com/repository/docker/enmotech/opengauss
Hi, I have one doubt in SSIS.
The source location has different files, and each file name comes with a location name. We want to load each file into its corresponding table using an SSIS package.
The source location has multiple files per location name.
Example: files location: c:\Sourcefile\
File names come like: hyd files, bang files.
Hyd files come like hyd.txt, hyd1.txt, hyd2.txt; they all have the same structure, and all hyd-related files should load into the hyd table only.
Bang files come like bang.txt, bang1.txt, bang2.txt; they all have the same structure, and all bang-related files should load into the bang table only.
All source files and target tables have the same structure.
Source file structure, for hyd.txt:
Id,name,loc
1,abc,hyd
2,hari,hyd
For hyd1.txt:
id,name,loc
4,banu,hyd
5,ran,hyd
Similarly for bang, for bang.txt:
id,name,loc
10,gop,bang
11,union,loc
For bang1.txt:
id,name,loc
14,ja,bang
All hyd-related text files should load into the hyd table in the SQL Server database; similarly, the bang files should load into the bang table.
hyd table structure:
CREATE TABLE [dbo].[hyd](
[id] [int] NULL,
[name] [varchar](50) NULL,
[loc] [varchar](50) NULL
)
Similarly for bang:
CREATE TABLE [dbo].[bang](
[id] [int] NULL,
[name] [varchar](50) NULL,
[loc] [varchar](50) NULL
)
I tried like below:
With the above, the table name is not resolved dynamically. I kept a static value in the table variable, and then all location-related records were loaded into one table.
How do I load multiple files into multiple destination tables in SSIS? Please tell me how to achieve this task.
From the screenshots I have 3 suggestions:
You have to set the Data Flow Task Delay Validation property to True.
You have to change the User::location variable value outside the Data Flow Task; you can add an Expression Task before the Data Flow Task with the following expression:
@[User::location] = SUBSTRING(@[User::FileName], 1, FINDSTRING(@[User::FileName], ".", 1) - 1)
or use a script component to achieve this.
Or you can add a Script Task followed by 2 Data Flow Tasks inside the For Each Loop; the Script Task checks the file name: if it is hyd it executes the first DFT, if it is bang it executes the second (check this link: Working with Precedence Constraints in SQL Server Integration Services).
I am not sure whether my issue is related to the Scala Play 2.5.x Framework or to PostgreSQL, so I am going to describe my setup.
I am using the Play 2.5.6 with Scala and PostgreSQL 9.5.4-2 from the BigSQL Sandboxes. I use the Play Framework default evolution package to manage the DB versions.
I created a new database in BigSQL Sandbox's PGSQL and PGSQL created a default schema called public. I use this schema for development.
I would like to create a table with the following script (1.sql in DB evolution config):
# Initialize the database
# --- !Ups
CREATE TABLE user (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
email TEXT NOT NULL,
creation_date TIMESTAMP NOT NULL
);
# --- !Downs
DROP TABLE user;
Besides that I would like to read the table with a code like this:
val resultSet = statement.executeQuery("SELECT id, name, email FROM public.user WHERE id=" + id.toString)
I get an error if I execute any of the mentioned code, or even if I run the CREATE TABLE ... code in pgadmin. The issue is with the user table name: if I prefix it with public (i.e. public.user), everything works fine.
My questions are:
Is it normal to prefix the table name with the schema name every time? That seems odd to me.
How can I make the public schema a default option so I do not have to qualify the table name? (e.g. CREATE TABLE user (...); will not throw an error)
I tried the following:
I set the search_path for my user: ALTER USER my_user SET search_path to public;
I set the search_path for my database: ALTER database "my_database" SET search_path TO my_schema;
search_path correctly shows this: "$user",public
I got the following errors:
In Play: p.a.d.e.DefaultEvolutionsApi - ERROR: syntax error at or near "user"
In pgadmin:
ERROR: syntax error at or near "user"
LINE 1: CREATE TABLE user (
********** Error **********
ERROR: syntax error at or near "user"
SQL state: 42601
Character: 14
This has nothing to do with the default schema. user is a reserved word.
You need to use double quotes to be able to create such a table:
CREATE TABLE "user" (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
email TEXT NOT NULL,
creation_date TIMESTAMP NOT NULL
);
But I strongly recommend not doing that. Find a different name that does not require a quoted identifier.
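For example, a minimal sketch with an unreserved name (app_user is just an illustrative choice, not something Play or PostgreSQL requires):
CREATE TABLE app_user (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL,
  email TEXT NOT NULL,
  creation_date TIMESTAMP NOT NULL
);
With a name like that you can write SELECT id, name, email FROM app_user ... without quoting or schema-qualifying the table.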
I'm confused by the errors I get when trying to create an in-memory H2 DB for my Spring Boot application. The relevant configuration is
db.url=jdbc:h2:mem:test;MODE=MySQL;DB_CLOSE_DELAY=-1;INIT=runscript from 'classpath:create.sql'
hibernate.hbm2ddl.auto=create
And create.sql:
CREATE TABLE `cities` (
`name` varchar(45) NOT NULL,
PRIMARY KEY (`name`)
) ;
INSERT INTO `cities` VALUES ('JAEN'),('ALBACETE');
But I get the error Caused by: org.h2.jdbc.JdbcSQLException: Table "CITIES" already exists;
What's weird is, if I remove the CREATE TABLE statement, I get:
Caused by: org.h2.jdbc.JdbcSQLException: Table "CITIES" not found;
The only thing that works is using DROP TABLE IF EXISTS, but well, I don't think I should need to.
What's going on? What's the proper way of pre-populating static data into an H2 memory DB?
1) Hibernate way: use import.sql file or specify files
spring.jpa.properties.hibernate.hbm2ddl.import_files=file1.sql,file2.sql
http://docs.spring.io/spring-boot/docs/current/reference/html/howto-database-initialization.html
2) Spring Boot: use default schema.sql & data.sql files
or specify files through properties
spring.datasource.schema = file1.sql
spring.datasource.data = file1.sql, file2.sql
http://docs.spring.io/autorepo/docs/spring-boot/1.0.2.RELEASE/reference/html/howto-database-initialization.html
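For example, a minimal sketch of option 1 for your setup, assuming cities is mapped as an entity so hibernate.hbm2ddl.auto=create builds the table for you: drop the INIT=runscript clause from the JDBC URL and put only the static rows into src/main/resources/import.sql, which Hibernate executes after it creates the schema:
-- import.sql (run by Hibernate when hbm2ddl.auto is create or create-drop)
INSERT INTO cities (name) VALUES ('JAEN');
INSERT INTO cities (name) VALUES ('ALBACETE');
That way the table is created exactly once and the data is inserted only after it exists.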
I'm trying to run a simple external table program using Oracle 11g on a Linux VM. The problem is that I can't query any data from .txt files.
Here's my code:
CONN / as sysdba;
CREATE OR REPLACE DIRECTORY DIR1 AS 'home/oracle/TEMP/X/';
GRANT READ, WRITE ON DIRECTORY DIR1 TO user;
CONN user/password;
CREATE TABLE gerada
(
field1 INT,
field2 Varchar2(20)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY DIR1
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ';'
MISSING FIELD VALUES ARE NULL
)
LOCATION ('registros.txt')
)
REJECT LIMIT UNLIMITED;
--Error starts here.
SELECT * FROM gerada;
DROP TABLE gerada;
DROP DIRECTORY DIR1;
Here's the error message:
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file home/oracle/TEMP/X/GERADA_3375.log
And this is what registros.txt looks like:
1234;hello world;
I've checked my permissions on DIR1 and I do have read/write permissions.
Any ideas?
ORA-29913 and ORA-29400 mean that you're unable to access the directory and/or file.
Looking carefully at the CREATE DIRECTORY command it looks like the path you're using may be mis-formatted. Try putting a forward slash at the start of the path and removing the one at the end of the path when creating the directory - e.g. CREATE OR REPLACE DIRECTORY DIR1 AS '/home/oracle/TEMP/X';.
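A minimal sketch of the corrected sequence, assuming /home/oracle/TEMP/X really is where registros.txt lives and that the operating-system user running Oracle can read and write there (the ORACLE_LOADER driver also tries to write its log file, e.g. GERADA_3375.log, into that directory):
-- Recreate the directory object with an absolute path: leading slash, no trailing slash
CREATE OR REPLACE DIRECTORY DIR1 AS '/home/oracle/TEMP/X';
GRANT READ, WRITE ON DIRECTORY DIR1 TO user;  -- your schema name here, as in your script
-- The external table definition itself can stay unchanged
SELECT * FROM gerada;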
Share and enjoy.