We have a dump file that we want to import into an Amazon RDS server.
This is what I did:
Create a public db link and verify it works:
create public database link rdsdblink
connect to dbuser identified by dbpsw
using '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST='xxx')(PORT=1521)))(CONNECT_DATA=(SID=dbsid)))';
SQL> select * from dual@rdsdblink;
D
-
X
Create a directory for the dump file:
CREATE OR REPLACE DIRECTORY DATA_PUMP_DIR AS 'G:\DB';
Import the dump file:
impdp dbuser/dbpsw@rdsdblink tablespaces=EMP directory=DATA_PUMP_DIR dumpfile=EMP_dump.DMP logfile=EMP_dump.log network_link=rdsdblink
I have also added rdsdblink connection string to tnsnames.ora file and restarted oracle service ("shutdown immediate", then "startup").
The following error occurred:
Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
ORA-39001: invalid argument value
ORA-39200: Link name "rdsdblink" is invalid.
ORA-02019: connection description for remote database not found
My local oracle version:
Oracle Database 11g Express Edition Release 11.2.0.2.0 - Production
Remote oracle version:
Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
You've connected to the remote database (via dbuser/dbpsw@rdsdblink), but your DB link is created in your local database. At the moment you're trying to run the import on the remote DB, with a network link also to the remote DB, and that network link is trying to use a DB link that doesn't exist on that remote DB.
The tnsnames.ora entry and the DB link are completely separate things.
You need to connect normally, locally - using whichever credentials you used to create the DB link, probably. The network_link parameter will then make the local database session started by impdp act against the remote server, so your local directory can be used.
Except... it doesn't work like that. The remote database identified by the network_link can be used as the source of the import, without a dump file at all; but it can't be the target for an import from a file.
From the impdp documentation:
The NETWORK_LINK parameter initiates an import via a database link. This means that the system to which the impdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data directly to the database on the connected instance. There are no dump files involved.
If you really wanted to go down this route, I think you would need a link from remote to local, and to run the import against the remote (as you are now), but to be pulling directly from your schema - not from a previous export. You'd still need access to a DIRECTORY object on the remote server, as logs etc. would be written there, even if you weren't copying your dump file over. Even with nologfile I believe it will error if you don't specify a directory or don't have permissions on it.
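A minimal sketch of that reversed setup, assuming your local XE instance is actually reachable from the remote server (the link name, host, and credentials here are hypothetical placeholders):
-- Run ON the remote database: a link pointing back at the local XE instance
CREATE DATABASE LINK to_local_xe
CONNECT TO local_user IDENTIFIED BY local_pwd
USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=my.local.host)(PORT=1521))(CONNECT_DATA=(SID=XE)))';
Then run the import against the remote, pulling your schema directly over that link with no dump file:
impdp dbuser/dbpsw@rdsdblink schemas=local_user network_link=to_local_xe directory=DATA_PUMP_DIR logfile=pull_import.log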
The article you linked to in your previous question said the same thing:
For imports, the NETWORK_LINK parameter also identifies the database link pointing to the source server. The difference here is the objects are imported directly from the source into the local server without being written to a dump file. Although there is no need for a DUMPFILE parameter, a directory object is still required for the logs associated with the operation.
I am having challenges in creating a database link from an ORACLE DBCS to an ORACLE ATP.
I am creating a database link from an ORACLE DBCS (PAAS) to an ORACLE ATP (Autonomous Transaction Processing) database. I can't seem to get the proper connection set-up for this. Anyone who has successfully been able to?
My connection to the ORACLE ATP with SQL Developer uses a zipped Wallet.
CREATE DATABASE LINK TARGET_DB
CONNECT TO admin IDENTIFIED BY "Myp#ssword123!"
USING
'(DESCRIPTION=
(ADDRESS=
(PROTOCOL=tcps)
(HOST=99.99.99.99)
(PORT=1522))
(CONNECT_DATA=
(service_name=eoakbwd540pwkbi_myuseratp_high.atp.oraclecloud.com)))';
-- ip address and service names are fake
When I test the DB link using SQLDeveloper I get the ORA-28788 error code.
0. Setup
You start out with two instances:
DBCS - in my case Enterprise Edition/12.2 with port 1521 opened in the security lists
ATP Instance
Download the wallet zip-file from the ATP instance containing a tnsnames.ora, sqlnet.ora and some wallet files.
Then upload the unzipped files to your DBCS instance.
1. Wallet Configuration
On DBCS: Replace the sqlnet.ora and tnsnames.ora in the $ORACLE_HOME/network/admin folder with the ones from the zip file (you might need to merge them if you have existing entries that are still needed).
Replace the WALLET_LOCATION in the sqlnet.ora file with the actual location of your wallet files (specifically the cwallet.sso and ewallet.p12). Make certain the permissions allow access for the oracle user.
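For reference, the relevant sqlnet.ora entries might look like this after the edit (the wallet directory is a placeholder for wherever you uploaded the unzipped files):
WALLET_LOCATION =
  (SOURCE = (METHOD = file)
    (METHOD_DATA = (DIRECTORY = /u01/app/oracle/wallet)))
SSL_SERVER_DN_MATCH = yes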
2. Database Link
You have two options for the database link (that I know of). First, get the service names (e.g. randomatp_high) from your tnsnames.ora file.
Option 1: Use the username/password of your ATP admin user in the database link creation command:
create database link <DBLinkName> connect to ADMIN identified by "<ATPpassword>" using '<ATPServiceName>';
Option 2: Create two users with the same username and the same password in DBCS and ATP, connect to DBCS as that user, and then:
create database link <DBLinkName> using '<ATPServiceName>';
You might need to use alter session set global_names=false; to help with ORA-02085 saying the database link is connected to a different DB.
3. Test
Test the database link with e.g.:
select banner from v$version@<DBLinkName>;
My question is about the following: I need to migrate a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the extension .dmp (dump file).
I installed 11g Express Edition on a new server, which also installed the default database that comes with it (XE).
I want to restore the 10g database to a drive other than C:, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found instructions for adjustments and location changes within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
At a high level, that .dmp file is a collection of DDL and DML statements that will recreate whatever data and objects were exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
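For illustration, the export that likely produced your file would have been run on the source 10g database with something along these lines (the placeholders follow the same convention as the import command below; schema and file names are hypothetical):
expdp [USERNAME]/[PASSWORD] schemas=[SCHEMA] directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log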
Here's where it gets interesting: you say that you already have an 11g database, correct? If you want to, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible; typically speaking, anything you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or use an account that has been granted Data Pump privileges explicitly (a hedged example appears after the command template below).
Move the .dmp file to the server where your 11g database lives. To make it even easier for yourself, you can move the .dmp file to your database's Data Pump directory. If you don't know where that is, execute the following query on your database:
select * from all_directories where directory_name = 'DATA_PUMP_DIR';
This query will return a directory path. You don't have to use this directory; it will just make things easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the .dmp file. Open a new command-line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
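For step 1 above, a minimal sketch of granting those privileges (the user name and password are hypothetical; run this as a privileged user such as SYSTEM):
CREATE USER import_user IDENTIFIED BY import_pwd;
GRANT CONNECT, IMP_FULL_DATABASE TO import_user;
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO import_user;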
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example, your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path, hope this helps and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ
When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided in the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with the DROP MASTER KEY command.
Microsoft Tech Support verified this solution did not work on my installation of SQL Server and, after actually taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script to remove the Master Key from the BACPAC file, but it requires extracting, renaming files, and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code in my scripts on other blogs etc. I am not able to give credit to those folks, as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you credit for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
Back up the current DB as MyLocalDB.bak.
Restore that backup from step 1 to a new DB with the previous day's date stamped at the end of the DB name (e.g., MyLocalDB20171231).
Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on)
Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
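A rough T-SQL sketch of steps 1-3 (the paths, the date suffix, and the logical file names 'MyLocalDB'/'MyLocalDB_log' are placeholder assumptions; check yours with RESTORE FILELISTONLY):
-- Step 1: back up the current DB
BACKUP DATABASE MyLocalDB TO DISK = 'C:\Backups\MyLocalDB.bak' WITH INIT;
-- Step 2: restore that backup under a date-stamped name
RESTORE DATABASE MyLocalDB20171231 FROM DISK = 'C:\Backups\MyLocalDB.bak'
    WITH MOVE 'MyLocalDB' TO 'C:\Data\MyLocalDB20171231.mdf',
         MOVE 'MyLocalDB_log' TO 'C:\Data\MyLocalDB20171231_log.ldf';
-- Step 3: drop the original so the import can recreate it under the same name
DROP DATABASE MyLocalDB;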
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
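Note that xp_cmdshell is disabled by default; if that EXEC fails, you may need to enable it once first (requires sysadmin):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;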
The PowerShell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!
Given the DAT file and the DDL file for each table in a DB2 database, can I import this data to SQL Server? I have no access to the original server or any copy of a DB2 server so connecting to a live instance isn't an option.
Can I do this without a live instance of DB2 or should I go back to the client and ask for CSV files? Is there a procedure or tool that makes this process smoother? I've tried to find a file-based connection string to use to connect to a set of DB2 files with no luck. I've also tried SwissSQLDB2ToSQLServer and SqlLinesData to see if they have a file-based option built in.
OK, given the comment above: you can't import DB2's container files (DAT, LRG, or anything else) directly. You need a CSV or equivalent. Yes, one way to get this is to run the EXPORT utility on a live DB2 database. HTH!
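For example, a hedged sketch of such an export from the DB2 command line (schema, table, and file names are hypothetical; OF DEL produces delimited ASCII, which is effectively CSV):
db2 "EXPORT TO employees.csv OF DEL SELECT * FROM myschema.employees"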
I'm learning ASP.NET MVC 3 with Entity Framework Code First. I'm following a tutorial and I downloaded the corresponding solution for testing on my local machine. Now, something I didn't understand very well is the automatic creation of the database (if it doesn't yet exist on disk). The very first time I ran the application, the database was created for me. That's OK.
Here is the section in Web.config
<add name="BlogContext"
connectionString="Data Source=.\SQL2008;Initial Catalog=CodeFirstMVC.mdf;Integrated Security=SSPI"
providerName="System.Data.SqlClient" />
But I have two problems:
1st. For testing purposes, I deleted the database on disk and ran the solution again. I thought the database would be automatically created, but I was wrong; I got the error message below:
{"Unable to open the physical file \"c:\\Program Files\\Microsoft SQL Server\\MSSQL10.SQL2008\\MSSQL\\DATA\\CodeFirstMVC.mdf.mdf\". Operating system error 2: \"2(failed to retrieve text for this error. Reason: 15105)\".\r\nCannot open database \"CodeFirstMVC.mdf\" requested by the login. The login failed.\r\nLogin failed for user 'sa'.\r\nFile activation failure. The physical file name \"c:\\Program Files\\Microsoft SQL Server\\MSSQL10.SQL2008\\MSSQL\\DATA\\CodeFirstMVC.mdf_log.LDF\" may be incorrect."}
I noticed that if I changed the file name in my Web.config, the database was again successfully created. Can you explain this to me? Why do I have to change the database name to get it running again?
2nd. The database is created in C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA. I would like to store my database in the App_Data directory instead. How can I proceed?
Initial Catalog is not a path to a file; it is the name of the database. AttachDbFilename is used to specify the file, so your connection string should look like:
Data Source=.\SQL2008;Initial Catalog=CodeFirstMVC;AttachDbFilename=|DataDirectory|CodeFirstMVC.mdf;Integrated Security=SSPI
Here, |DataDirectory| instructs SQL Server to use the local application data directory instead of the global SQL Server data directory. The local data directory for a web application is App_Data.
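Applied to the Web.config entry from the question, that becomes:
<add name="BlogContext"
     connectionString="Data Source=.\SQL2008;Initial Catalog=CodeFirstMVC;AttachDbFilename=|DataDirectory|CodeFirstMVC.mdf;Integrated Security=SSPI"
     providerName="System.Data.SqlClient" />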
Edit:
I just noticed that you are probably using full SQL Server instead of SQL Server Express. As far as I know, automatically creating the database in App_Data is a feature of SQL Server Express. That also explains the first error: SQL Server created a database called CodeFirstMVC.mdf and stored it in its global data directory, in the file CodeFirstMVC.mdf.mdf with the transaction log in CodeFirstMVC.mdf_log.LDF. It also registered that database internally. By deleting the files you didn't remove the database from SQL Server; you just broke it, while SQL Server still believes the database exists. That is also the reason why you have to change the name to make it work.