How to import a database (or schema) using the impdp command in Oracle?

I've never worked with an Oracle database before; everything I've done with databases has been in MySQL. I've been given a .dpdmp file that I need to import into an Oracle database.
I tried the example provided in this link, but not a single statement executed; in total it completed with 208 errors:
http://gerardnico.com/wiki/database/oracle/oracle_db_datapump
impdp system/root DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP
If I look at the log file, I assume this is the root cause of the problem:
Processing object type SCHEMA_EXPORT/USER
ORA-39083: Object type USER:"MYUSER" failed to create with error:
ORA-65096: invalid common user or role name
Since the user creation failed, every statement executed after it also resulted in an error. The dump file was created on Oracle 10g, while my Oracle is 12c. Is this due to a version conflict?

Please create the user "MYUSER" first, and then try the import with additional parameters like
REMAP_TABLESPACE=old_tbsp:new_tbsp
This will import the data into your new tablespace.
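ORA-65096 typically means the import is connected to the root container of a 12c multitenant (CDB) database, where ordinary user names are rejected because common users must start with C##; it is not a 10g/12c dump-version conflict as such. A minimal sketch of creating the user inside a pluggable database instead, assuming a PDB named MYPDB and a tablespace NEW_TBSP (both placeholder names):
ALTER SESSION SET CONTAINER = MYPDB;  -- plain user names are only valid inside a PDB
CREATE USER myuser IDENTIFIED BY mypassword
  DEFAULT TABLESPACE new_tbsp
  QUOTA UNLIMITED ON new_tbsp;
GRANT CONNECT, RESOURCE TO myuser;
When retrying, point impdp at the PDB service (for example system/root@//localhost/MYPDB) rather than the CDB root.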

Related

Error when copying CSV files from Windows directory into SQL Server DB by using Apache NiFi

I am trying to copy CSV files from my local directory into a SQL Server database running on my local machine by using Apache NiFi.
I am new to the tool and have spent a few days googling and building my flow. I managed to connect to the source and destination, but I am still not able to populate the database, since I get the following error: "None of the fields in the record map to the columns defined by the tablename table."
I have been struggling with this for a while and have not been able to find a solution on the web. Any hint would be highly appreciated.
Here are further details.
I have built a simple flow using the GetFile and PutDatabaseRecord processors.
My input is a simple table with 8 columns.
My configuration for the GetCSV processor is here (I have added the input directory and left the rest as default).
The configuration for the PutDatabaseRecord processor is here (I have referenced the CSVReader and DBCPConnectionPool controller services, used the MS SQL 2012+ database type (I have the 2019 version), configured the INSERT statement type, entered the schema and correct table name, and left everything else as default).
The CSVReader configuration looks as shown here (Schema Access Strategy = Use String Fields From Header; CSV Format = Microsoft Excel).
And this is the configuration of the DBCPConnectionPool (I have added the correct URL, DB driver class name, driver location, DB user and password).
Finally, this is a snapshot of the description of the table I have created in the database to host the content.
Many thanks in advance!
The warning "None of the fields in the record map to the columns defined by the tablename table." also appears when the processor is not able to find the table at all. This can happen even when the table name is correctly configured in PutDatabaseRecord but there is some issue with the user's access rights (which turned out to be the actual cause of my error).
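If access rights turn out to be the problem, granting the login that NiFi connects with explicit permissions on the target table may resolve it. A minimal sketch, assuming a SQL login named nifi_user and a target table dbo.MyTable (both placeholder names):
USE MyDatabase;
GO
-- map the login to a database user if it isn't already
CREATE USER nifi_user FOR LOGIN nifi_user;
GO
-- allow NiFi to read the table metadata and insert records
GRANT SELECT, INSERT ON OBJECT::dbo.MyTable TO nifi_user;
GO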

Cannot Import DB using the Azure Portal

I am trying to import a DB in the Azure Portal. The original DB I exported from was on a different server, but is set up identically to where I am trying to import. I am importing by going to the target server and clicking the import button. I then choose the storage account, container, and bacpac file I would like to import. I check that the database size and type are the same for the import as the bacpac file, and I double-check that the collation is the same on the import as it is in the bacpac. I then confirm. It attempts the import for about 20 minutes before giving the error message below. I can see that the DB is created when I go to the SQL server and click the SQL databases blade, but the tables inside the DB are empty.
Could not import package.
Warning SQL72012: The object [data_0] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClie
I have seen some responses regarding similar issues, but they all seem to be using SSMS. Does anyone have any ideas on how to fix this issue inside the Azure Portal? Also, does anyone know what check box they are talking about? There is no checkbox when I am doing the import setup.
The warnings you are getting are a bit of a red herring; the issue is the error. The line you posted only shows a generic error, which should be followed by the actual error. Try going to the actual database server and checking the import/export history.
Also try importing using PowerShell, which may give you some more details on the error you are getting:
# $importRequest is the object returned by an earlier New-AzSqlDatabaseImport call
$importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
[Console]::Write("Importing")
while ($importStatus.Status -eq "InProgress") {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
$importStatus
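For context, $importRequest above would come from starting the import with New-AzSqlDatabaseImport. A hedged sketch, with placeholder resource names and values:
$importRequest = New-AzSqlDatabaseImport `
    -ResourceGroupName "myResourceGroup" `
    -ServerName "myserver" `
    -DatabaseName "MyImportedDb" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<storage-account-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/mydb.bacpac" `
    -AdministratorLogin "serveradmin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "<password>" -AsPlainText -Force) `
    -Edition "Standard" `
    -ServiceObjectiveName "S0" `
    -DatabaseMaxSizeBytes 5000000000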
Without knowing the error you are getting, it's a bit of a stab in the dark trying to guess what the issue is. Given the version of SQL Server you are exporting from, I would guess this is an on-premises database server.
One of the common reasons bacpac files tend to fail on import is that your source database server isn't configured to allow contained databases.
If that's the case, you need to go to your source database server (where you are exporting from) and enable that option:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO
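To verify the setting took effect, you can query sys.configurations; a quick check:
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'contained database authentication';
A value_in_use of 1 means the option is enabled.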
Once the option is enabled, recreate your bacpac file and try the import again.
As I mentioned, this is a complete stab in the dark, as you haven't provided the error you are actually getting.

SELECT * INTO another database on a different SQL instance running older SQL Server

I need to copy some tables from a SQL Server 2016 instance to a SQL Server 2008 instance, like this:
select *
into [sql8].[DatabaseA].[dbo].[Customers]
from [DatabaseA].[dbo].[Customers]
but I get an error:
Msg 117, Level 15, State 1, Line 9
The object name 'sql8.DatabaseA.dbo.Customers' contains more than the maximum number of prefixes. The maximum is 2.
I have tried generating a script of the data, but my machine runs out of memory during SQLCMD execution from the command line.
Looking for recommendations / pointers.
Thanks
I'm guessing you may need to set up the sql8 server as a linked server on the server holding the DB you're trying to get the data into. In the image, I would be trying to get the data into a DB on the MJAYWCO1 server; [sql8] would be the server you want to create a link to, so that [sql8].[DatabaseA].[dbo].[Customers] resolves.
To do this from the SSMS GUI, go to Server > Server Objects > Linked Servers:
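If you prefer T-SQL over the GUI, the link can be created with the system procedures instead. A minimal sketch, assuming SQL authentication and placeholder credentials:
-- register the remote server as a linked server
EXEC sp_addlinkedserver
    @server = N'sql8',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'sql8';
-- map local logins to a remote SQL login (placeholder credentials)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'sql8',
    @useself = N'FALSE',
    @locallogin = NULL,
    @rmtuser = N'remoteUser',
    @rmtpassword = N'remotePassword';
One caveat: SELECT ... INTO cannot create a table on a linked server, so you would create the empty [Customers] table on sql8 first and then run INSERT INTO [sql8].[DatabaseA].[dbo].[Customers] SELECT * FROM [DatabaseA].[dbo].[Customers] instead.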
Another possibility: have you tried importing it directly into the new DB, assuming you can connect to the old database from the new database with credentials?
If this doesn't work, you can use Export Data on the "old DB" (under Import Data in the second image) to create an XML or .CSV file, or whatever format might be applicable, and then use the Import Wizard on the "new DB".
Please forgive me if I misunderstood the question as English is my first language and I went to government schools.

SQL Server 2008 R2 Errors running SSIS Package through the Job Agent

I have an SSIS package that I created through the Import Export (32-bit) tool. When I execute the package manually through the Execute Package Utility, the package runs successfully with no issues. However, when I try to run the package through a Job Agent in SSMS, I keep getting errors. The primary error seems to be:
Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state." You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available.
I'm using SQL Server 2008 R2.
I have researched this error to some degree and I think it has something to do with the package protection level. I feel like I've tried the configurations that make the most sense, but none seem to work for me. The options are:
Encrypt sensitive data with user key
Do not save sensitive data
Encrypt sensitive data with password
Encrypt all data with user key
Encrypt all data with password
Rely on server storage and roles for access control
I feel like I should be using the last option here (Rely on server storage...) because I prefer to use SQL Server Authentication. I use SQL Server Authentication on the 'Choose Destination' window of the SQL Server Import and Export Wizard, and similarly I use the same username and password when I create the Job Agent in SSMS on the General tab of the Job Step Properties. Is it possible that there is something I need to add to the user I'm using in SSMS, even though it works outside of SSMS?
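For reference, a package's protection level can also be changed outside the designer with the dtutil utility. A hedged sketch, assuming a file-based package and the EncryptSensitiveWithPassword level (numeric value 2); the file name and password are placeholders:
REM re-save the package with password-based encryption of sensitive values
dtutil /File MyPackage.dtsx /Encrypt FILE;MyPackage.dtsx;2;MyPackagePassword
The job step would then have to supply the same password (dtexec's /Decrypt option) so the Agent account can read the encrypted values, since user-key encryption is only readable by the account that saved the package.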
Something else I wondered that might have an impact: having the option "Drop and re-create destination table" checked in the Column Mappings window of the Import and Export tool. I was using a stored procedure to remove the tables before executing the Job Agent, and I feel like it was working at one point. Could that have something to do with it?
Again, the thing that baffles me most is that it runs with no problem when I execute it manually through the Execute Package Utility.
I've included images of some of the windows I mentioned above if that helps.

What state is my SQL server database in when msdeploy fails on user creation?

I am using msdeploy (version 2) to transfer a database from machine A to machine B.
In the database on machine A there are some users that do not exist on machine B, thus the transfer (partially) fails with the message:
Error Code: ERROR_SQL_EXECUTION_FAILURE
More Information: An error occurred during execution of the database script.
The error occurred between the following lines of the script: "3" and "5".
The verbose log might have more information about the error.
The command started with the following: "CREATE USER [someDomain\someUser] FOR LOGIN [someDomain"
Windows NT user or group 'someDomain\someUser' not found.
Check the name again. http://go.microsoft.com/fwlink/?LinkId=178587
The database seems to be transferred, except for the user creation. Does anyone know what state the database is in after this failure?
Is there any way I can transfer the database without the users (or better without specific users) using msdeploy?
Web Deploy uses SMO (SQL Management Objects) to script out and apply the scripts for SQL databases, and exposes most of the SMO settings with the dbfullsql provider (so, most of these options: http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.transfer_properties.aspx). If you want to skip the users due to this kind of login-not-exists or user-not-found error, you should be able to do this by adding the scripting option: copyAllUsers=false to the source of the sync. For example:
msdeploy.exe -verb:sync -source:dbfullsql="Data Source=.\SQLExpress;Initial Catalog=MySourceDb;User Id=localUser;Password=LocalPass",copyAllUsers=false -dest:dbfullsql="Data Source=RemoteSQLServer;Initial Catalog=MyDestDb;User Id=remoteUser;Password=RemotePass"
Incidentally, I am surprised you note the db appears to have been sync'd - I would expect this is not actually the case. If you have the permissions for it, Web Deploy will create the database if it did not already exist when it initially tries to make the connection, but your failure occurred very early in the script execution, and I believe Web Deploy dbfullsql syncs are transacted by default (the db creation is separate from the script execution and is not transacted). Thus the db may exist where it did not pre-sync, but I wouldn't expect the data to be present in it.
