BACPAC backups seem to be stuck (SQL Server)

I'm trying to export my databases and they seem to be "stuck". I created the first backup, but when I went to download it, I noticed the file size was 0 MB. So I tried again and got the error below:
Database export error
Failed to export the database: xxxxx-db.
ErrorCode: 409
ErrorMessage: There is an import or export operation in progress on the database 'xxxxx-db'.
I then tried copying the database into a new database and backing THAT one up. When I checked the database server's "Import/Export History", it showed the first export stuck at 90%, and now it won't even attempt the second one. And in order to file a support ticket with Microsoft, it wants me to pay for a monthly support plan. Anyone have any ideas?

If an export operation runs longer than 20 hours, there is a good chance it has been cancelled. In that case, scale up the tier of the database before beginning the export, make sure there is no read/write activity on the database, and make sure all large tables have clustered indexes (see the sketch below).
Are you trying to export a database of 200 GB or greater? If that is the case, export to local storage instead.
You may have a better experience exporting to premium storage.
Sometimes many users exporting databases in the same region can cause export operations to take more time. Take a look at this Microsoft Support article.
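As a rough T-SQL sketch of the two checks above (the 'P2' service objective is just an example, and the database name is the one from the question), you can list heaps and scale the tier up before exporting:

-- Run in the database you plan to export: user tables without a clustered index (heaps)
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id) AS table_name
FROM sys.indexes
WHERE index_id = 0
  AND OBJECTPROPERTY(object_id, 'IsUserTable') = 1;

-- Run against the server's master database: temporarily scale up before the export
-- ('P2' is an example objective; pick whatever tier fits your budget)
ALTER DATABASE [xxxxx-db] MODIFY (SERVICE_OBJECTIVE = 'P2');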
A workaround that may work in your case: use SQL Server Management Studio (SSMS) to export the database instead of the portal (or, if you were already using SSMS, switch to the portal to perform the export).
1. Open SQL Server Management Studio
2. Connect to the Azure SQL Server
3. Right-click the required database, choose "Tasks" and then "Export Data-tier Application"
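The same export can also be run from the command line with SqlPackage.exe; a sketch, with the server name, credentials, and output path as placeholders you would fill in:

rem Placeholders below are hypothetical - substitute your own server, login, and path
SqlPackage.exe /Action:Export /ssn:tcp:<YourServer>.database.windows.net,1433 /sdn:xxxxx-db /su:<UserName> /sp:<Password> /tf:C:\Backups\xxxxx-db.bacpac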

Related

dacpac file Publish error to LocalDB: "The element cannot be deployed. This element contains state that cannot be recreated in the target database."

Pretty simple, you would think, but as I cannot edit the schema in this database, I have no idea how to get past this error when publishing my dacpac file to my local database. I am trying to take a copy of a database that is hosted in Azure and have it locally for my own development purposes. I am not a sysadmin of the database, but I have complete access to it otherwise. It is a production database, so I can't mess anything up, for obvious reasons.
I had a hell of a time even getting this dacpac file created in the first place. I was getting far more errors/warnings when trying to export as a bacpac file with data (which is what I really want to do, but I can worry about that later).
Here is the command I am trying:
SqlPackage.exe /Action:Publish /SourceFile:"C:\Data\opkCore.dacpac" /TargetConnectionString:"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=opkCore;Integrated Security=true;"
This is what I used to create the dacpac file:
SqlPackage.exe /Action:Extract /ssn:tcp:<MyDatabase>.database.windows.net,1433 /sdn:opkCore /su:<MyUserName> /sp:<MyPassword> /tf:C:\Data\opkCore.dacpac
I have tried other solutions such as:
1. Export Data-tier Application - but I am limited to doing it only to an Azure container, and I am not in control of that. It is a Pay-As-You-Go model, which apparently does not support blob storage.
2. Copy Database - only works for 2005 and earlier, and this is SQL 2019.
3. Deploy Database to Microsoft SQL Server Azure SQL Database.
4. Import Data-tier Application - same problem as #1.
5. Exporting a bacpac file using SqlPackage.exe - errors all over the place that I cannot fix. The database is not mine to mess up.
I CAN export tables one at a time, but then I am missing certain bits of schema that work together, so I get errors there as well.
I really should be able to just get a local copy of the database in the EXACT same state it is currently in on our production server. Any other ideas on how I can do this in a way that ignores problems with the database and just gets me a local copy EXACTLY the way the database is in production? Third-party tools that do this, or anything else?
I decided to just script all the tables and run the script on my new DB. There were a lot of errors, but it did what it could, which was 99.99% of the database schema, and that is good enough for my purpose. Maybe I will try to get the data exported and imported as well.
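If the combined script is too big for SSMS to open, sqlcmd can run it directly against LocalDB; a sketch, where AllTables.sql is a hypothetical name for the script generated above:

rem AllTables.sql is a hypothetical file name for the generated script
sqlcmd -S (localdb)\MSSQLLocalDB -d opkCore -i C:\Data\AllTables.sql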
EDIT: To export the data I just used SSMS Export Data, with the destination set to the new LocalDB database I had just created from the scripts.

How to log/trace data export in SQL Server 2012

Is it possible to log/trace data exports in SQL Server?
I'm trying to demonstrate that our environment is safe and everything is logged. But when I trace the data export (using the standard template), it doesn't really show up in the trace: I can see the query, but nothing shows that the data was actually exported.
We don't have a dedicated DBA yet so this kind of fell on me.
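One option, as a minimal sketch (the object names here are hypothetical, and database-level audit specifications require Enterprise Edition on SQL Server 2012): SQL Server Audit can record every SELECT against the database, which captures the read that feeds any export:

USE master;
-- Hypothetical audit name and log path
CREATE SERVER AUDIT ExportAudit TO FILE (FILEPATH = 'C:\AuditLogs\');
ALTER SERVER AUDIT ExportAudit WITH (STATE = ON);
GO
USE MyDatabase;  -- hypothetical database name
CREATE DATABASE AUDIT SPECIFICATION ExportAuditSpec
    FOR SERVER AUDIT ExportAudit
    ADD (SELECT ON DATABASE::MyDatabase BY public)
    WITH (STATE = ON);
GO
-- Later, read the captured events back:
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file('C:\AuditLogs\*', DEFAULT, DEFAULT);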

Transferring a SQL Database on AWS

We have a SQL Server 2005 database on our local server. I have to transfer it to our SQL Server 2012 instance on Amazon RDS. Here is what I did (of course I'll have to repeat the procedure for the other databases):
I right-clicked the database, selected Generate Scripts - All tables - Copy Schema and Data, and saved everything as a .sql file.
At that point I attempted to use the SQL Azure MW v5.15 (in a question here I saw that it works with AWS too, way to go Microsoft!) to transfer the database to AWS.
However, it crashes.
No problem; I tried to use SQL Server Management Studio to import the file, but as soon as the RAM consumed by the program reached 1 GB (that DB is 3.4 GB) - BOOM, out-of-memory error!
What should I do now?
You'll need to split the creation into parts. I faced that problem some time ago; my scripts reached about 4 GB with just the schemas, tables, and so on. I think you should first generate your scripts for creating schemas, users, and logins. After that, tables, views, and procedures. Then the other objects, like jobs and functions. Finally, export all the data to RDS through the Import/Export Wizard in SSMS.
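For script files that big, sqlcmd streams from disk instead of loading the whole file into memory the way SSMS does; a sketch, with the RDS endpoint, credentials, and file names as placeholders:

rem Endpoint, login, and script names below are hypothetical placeholders
sqlcmd -S mydb.xxxxxxxx.us-east-1.rds.amazonaws.com,1433 -U admin -P <password> -d MyDatabase -i C:\Scripts\01_schemas_users_logins.sql
sqlcmd -S mydb.xxxxxxxx.us-east-1.rds.amazonaws.com,1433 -U admin -P <password> -d MyDatabase -i C:\Scripts\02_tables_views_procs.sql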
I followed those steps and it worked for me.
Good luck!

Tool to copy SQL Server 2008 db to SQL Server 2008 Express?

I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev work, make changes, etc. on the local copy. I have some constraints, though: the source db is part of a live e-commerce site in shared hosting, so I can't detach it, and the hosting service wants me to pay $5 for each ad hoc backup I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import into my local one. I've tried the SSMS 2008 Copy Database Wizard, but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked, but when I went to import using SQLCMD (the script was 1 GB, so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
If they don't charge you to run queries against the database, these tools may help. Granted, they are not free, but they are handy on so many fronts that it would be worth buying one. These tools can diff your source db and target db, data and structure or just one or the other, and optionally sync the target database to be just like the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table, or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.

How should I go about transferring data from an ODBC app to SQL on an hourly basis?

I'm trying to pull data from an ODBC app into a SQL 2005 (dev edition) DB on an hourly basis. When I run SSIS, the option to import all tables and views is grayed out, and it forces you to write a query. How would I go about setting up an SSIS package to update ALL 250-some tables on an hourly basis?
What kind of database is your ODBC data source pointing to? SSIS might not give you a GUI for selecting tables/views for all DB types.
Perhaps you could rephrase your question a little; I am not 100% sure what you are asking here. Are you trying to get data into SQL Server from an application via SSIS, with the Data Transform task using an ODBC connection to the application?
Anyhoo, the simple answer to the MS Access part of your question is "hell no" - MS Access is never, ever the answer to anything ;-)
I would be inclined to find out why the tables and views are greyed out and fix that issue (there is not enough info in this question to determine why they are greyed out).
You might be better off using the Import and Export Wizard. Go into SQL Server Management Studio, right-click on the database you want to import the data into, and select Tasks -> Import Data. It will launch the wizard, which will walk you through defining the import process.
At the end of the wizard you can choose to execute the import, and even save it as an SSIS package which you can tweak later.
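Since the requirement is hourly, the saved package can then be scheduled, for example as a SQL Server Agent job step or from Windows Task Scheduler calling dtexec; a sketch, with a hypothetical package path:

rem The package path is a hypothetical example
dtexec /F "C:\SSIS\HourlyOdbcImport.dtsx"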
