Import table from Windows SQL Server 2012 to hadoop using sqoop - sql-server

I was trying to import a table from an RDBMS (SQL Server 2012 on Windows) to HDFS using the command below, but I'm getting an error. I can connect to the database successfully.
sqoop import \
  --connect "jdbc:sqlserver://192.1x8.xx.1:14xx;database=AdventureWorks2012;username=hadox;password=hadxx" \
  --table Production.Product \
  --hive-import
I understood that the error is caused by the dot (.) in the table name.
I found that information in the linked question "sqoop to import data to hive", but I couldn't follow the details given there.
Can anyone help, please?
Thanks in advance.
Error:
ERROR manager.SqlManager: Error executing statement com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Production.Product'.
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Production.Product'.

Internally, Sqoop treats Production as the schema name (database name) and Product as the table name.
If you want to import the Production.Product table into Hive, I would suggest using --query in your Sqoop command; with it you tell Sqoop exactly which table to read.
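A minimal sketch of that --query form, reusing the connection details from the question (the split column ProductID, the target directory, and the Hive table name are assumptions; --query requires a $CONDITIONS placeholder and an explicit --target-dir):

```shell
sqoop import \
  --connect "jdbc:sqlserver://192.1x8.xx.1:14xx;database=AdventureWorks2012" \
  --username hadox --password hadxx \
  --query 'SELECT * FROM Production.Product WHERE $CONDITIONS' \
  --split-by ProductID \
  --target-dir /user/hadoop/product \
  --hive-import --hive-table product
```

Note the single quotes around the query: they keep $CONDITIONS literal so Sqoop, not the shell, substitutes the split predicates.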

Related

PostgreSQL TDS_FDW connection to pull in SQL Server metadata

I'm creating a metadata repository in PostgreSQL, but I'm having difficulty getting TDS_FDW to import SQL Server's INFORMATION_SCHEMA tables.
Using tds_fdw version 2.0.0-alpha.3
When I try to import "master", TDS_FDW produces a syntax error:
postgres=# IMPORT FOREIGN SCHEMA master FROM SERVER sql002 INTO public OPTIONS (import_default 'true');
ERROR: syntax error at or near ")"
LINE 2: ) SERVER sql002
A test import of the dbo schema works fine, but INFORMATION_SCHEMA doesn't exist as an importable schema.
What would be a functional way to pull the database and column names from SQL Server into PostgreSQL? (It's been ages since I was a SQL Server DBA.)
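One workaround sketch, under the assumption that tds_fdw's per-table `table_name` option accepts a schema-qualified name: rather than importing the whole schema, define a single foreign table against INFORMATION_SCHEMA.COLUMNS by hand and query it from PostgreSQL. The server name sql002 comes from the question; the foreign table name and trimmed column list are illustrative only:

```shell
psql -d postgres <<'SQL'
-- Map SQL Server's INFORMATION_SCHEMA.COLUMNS view to a hand-written foreign table
CREATE FOREIGN TABLE mssql_columns (
    table_catalog varchar,
    table_schema  varchar,
    table_name    varchar,
    column_name   varchar,
    data_type     varchar
) SERVER sql002
  OPTIONS (table_name 'INFORMATION_SCHEMA.COLUMNS');

-- Pull database/column metadata into PostgreSQL
SELECT table_name, column_name, data_type FROM mssql_columns;
SQL
```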

How to import a database (or Schema) using impdp command in Oracle?

I've never worked with an Oracle database before; everything I've done with databases has been in MySQL. I was given a .dpdmp file that I need to import into an Oracle database.
I tried the example provided in this link, but not a single statement was executed; in total it completes with 208 errors.
http://gerardnico.com/wiki/database/oracle/oracle_db_datapump
impdp system/root DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP
If I look at the log files, I assume this is the root cause of the problem
Processing object type SCHEMA_EXPORT/USER
ORA-39083: Object type USER:"MYUSER" failed to create with error:
ORA-65096: invalid common user or role name
Since the user creation failed, every statement executed after it also resulted in an error. The dump file was created in Oracle 10g, while my Oracle is 12c. Is this due to a version conflict?
Please create the user "MYUSER" first, and then try the import again with additional parameters such as
REMAP_TABLESPACE=old_tbsp:new_tbsp
This will import the data into your new tablespace.
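A sketch of that workflow (the PDB name pdb1, the password, and the tablespace names are placeholders). ORA-65096 typically means the user was being created in a 12c container root, where ordinary user names are rejected unless prefixed with C##, so creating the user inside a pluggable database first tends to sidestep it:

```shell
sqlplus / as sysdba <<'SQL'
-- In 12c, create the schema owner inside a PDB (or use a C##-prefixed name in the root)
ALTER SESSION SET CONTAINER = pdb1;
CREATE USER myuser IDENTIFIED BY mypassword;
GRANT CONNECT, RESOURCE TO myuser;
SQL

# Re-run the import, remapping the 10g tablespace onto an existing 12c one
impdp system/root@pdb1 DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP \
  REMAP_TABLESPACE=old_tbsp:new_tbsp
```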

Import from sql server to hbase, retrieved 395809 records from sql using sqoop, but only 365587 rows in hbase

I imported data from SQL Server to HBase: Sqoop retrieved 395809 records from SQL Server, but only 365587 rows ended up in HBase.
The import command I used is
sqoop import --hbase-table test --hbase-row-key conversation_id \
  --column-family cf1 --columns conversation_id,app_id \
  --connect "jdbc:sqlserver://******;database=*******;username=test;password=*******" \
  --table test -m 1
The command I used to create Hbase table is
create 'test',{NAME=>'cf1',BLOCKCACHE=> false,BLOCKSIZE=>1073741824}
How can I solve this problem?
Thanks,
Mqi
It's difficult to say, but chances are conversation_id is not unique. HBase rows that share a row key overwrite one another, so duplicate conversation_id values would each collapse into a single row. For more interactive help on this subject, try the Sqoop mailing lists.
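One way to check the uniqueness theory without leaving Sqoop is sqoop eval, which runs an ad-hoc query against the source database (connection string abbreviated exactly as in the question). If the two counts differ, duplicate keys account for the missing rows:

```shell
sqoop eval \
  --connect "jdbc:sqlserver://******;database=*******;username=test;password=*******" \
  --query "SELECT COUNT(*) AS total, COUNT(DISTINCT conversation_id) AS distinct_keys FROM test"
```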

ogr2ogr to import GML with spatial and non-in SQL Server

I am trying to use ogr2ogr to import a GML file into SQL Server Spatial. Features with geometry import successfully, but I have a few features without a geometry column. How can I import all of them?
EDIT:
I reinstalled GDAL with the latest version and it works fine, but it still can't write the non-spatial features.
Constantly getting error:
ERROR 1: Error creating layer: [Microsoft][ODBC SQL Server Driver][SQL
Server]Incorrect syntax near 'NULLCONSTRAINT'.
The error looks like an incorrectly formed SQL statement generated by ogr2ogr against the SQL Server database.
Have you tried running SQL Server Profiler (within SQL Server) while you do the import? Assuming you run a standard trace, you will need to locate the row with NULLCONSTRAINT in the TextData column of the trace output. Once you have found the problem statement, it should give you some idea of how to fix the problem.
If you need a very simple tutorial on the Profiler this link might help
http://www.mssqltips.com/sqlservertutorial/272/profiler-and-server-side-traces/
A similar error for me was caused by square brackets in the name of the shapefile I was importing, which made the generated CREATE TABLE statement invalid.
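Building on that observation, one sketch worth trying is to force a plain output layer name with ogr2ogr's -nln flag, so the generated CREATE TABLE statement cannot inherit problem characters from the source file name (the server, database, and file names below are placeholders):

```shell
ogr2ogr -f MSSQLSpatial \
  "MSSQL:server=localhost;database=gisdb;trusted_connection=yes" \
  input.gml \
  -nln clean_layer_name
```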

Sqoop export from hdfs to SQL Server 2005 using jTDS driver fails

I'm trying to export data from an HDFS text file to SQL Server using Sqoop. When I have more than a couple of rows to insert, it throws the following exception:
java.io.IOException: java.sql.SQLException: Incorrect syntax near ','.
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:192)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:567)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:675)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.sql.SQLException: Incorrect syntax near ','.
at net.sourceforge.jtds.jdbc.SQLDiagnostic.addDiagnostic(SQLDiagnostic.java:368)
at net.sourceforge.jtds.jdbc.TdsCore.tdsErrorToken(TdsCore.java:2820)
at net.sourceforge.jtds.jdbc.TdsCore.nextToken(TdsCore.java:2258)
at net.sourceforge.jtds.jdbc.TdsCore.getMoreResul
I've checked the data for inconsistencies and I can't find anything strange.
I was wondering, is the driver even supported?
The problem was caused by the default way Sqoop/jTDS groups multiple insert statements into one using a comma-separated list of values. That syntax is not compatible with SQL Server 2005. To get around it, I enabled JDBC batch inserts by passing the --batch parameter.
The default multi-row insert syntax that Sqoop/jTDS generates is supported from SQL Server 2008 onward.
Also, when I tried the same thing (without the --batch argument) using Microsoft's JDBC driver, everything worked fine. I'm not sure how the Microsoft driver and Sqoop work together to ensure compatibility with SQL Server 2005.
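A sketch of the working export command under those assumptions (host, database, credentials, table, and export directory are placeholders; --batch switches Sqoop to JDBC batching instead of the multi-row VALUES (...),(...) syntax that SQL Server 2005 rejects):

```shell
sqoop export \
  --connect "jdbc:jtds:sqlserver://dbhost:1433/mydb" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username myuser --password mypass \
  --table mytable \
  --export-dir /user/hadoop/mydata \
  --batch
```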
