I've been trying to sort through Microsoft's extensive documentation but cannot find the answer I'm looking for, hence posting it here for the experts!
I have a table in a database in MS SQL Server 2016 that I read/write using MS SSMS. I would like to export this single table into my Azure storage account for further analysis on the MS Data Science Virtual Machine, but I cannot find a way to do this. Any suggestions?
Thanks.
You can use the tools built into the MS Data Science Virtual Machine (DSVM) to first export the table from SQL Server to CSV. BCP (command line) is one such tool.
If you want a graphical tool, use SSMS and its "Import and Export" option to save the result of your query to a CSV file. Then you can copy the CSV file to an Azure storage account using AzCopy (command line) or Azure Storage Explorer (graphical), both of which are also available on the DSVM. Hope this helps.
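For example, a rough command-line sketch of that route (the table, file, and storage names below are placeholders, and the SAS token is elided):

REM Export a single table to CSV with bcp (-T uses Windows authentication, -c character mode, -t, comma delimiter)
bcp MyDatabase.dbo.MyTable out C:\Temp\MyTable.csv -S localhost -T -c -t,

REM Copy the CSV to blob storage with AzCopy (supply your own container URL and SAS token)
azcopy copy "C:\Temp\MyTable.csv" "https://<storageaccount>.blob.core.windows.net/<container>/MyTable.csv?<SAS>"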
I can't really find anything online about how to do this.
There are a few separate, offline Microsoft Access databases in existence...
Everyone has begun staging different .accdb files in an Amazon S3 bucket - I'm hoping Snowflake now provides an easy (ish) solution to reading them into the SQL database I'm building.
The short answer is that you can't. Snowflake can import text files in various formats (CSV, XML, JSON, etc.) but it has no extract capabilities, so it can't connect to applications and read data from them: asking it to read an MS Access file is no different from asking it to read an Oracle or SQL Server file.
You probably have two options:
Export the data from MS Access to a file format that Snowflake can ingest (a rough Snowflake example follows this list).
Use an ETL tool that can read from MS Access and write to S3 as text files (or directly to Snowflake, which is probably simpler).
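For the ingest side of the first option, a rough Snowflake sketch (the file format, stage, bucket, and table names are placeholders, and the S3 credentials are elided):

-- Describe the exported CSV files (adjust delimiters/headers to match your Access export)
CREATE OR REPLACE FILE FORMAT access_csv TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"';

-- Point an external stage at the S3 bucket holding the exports
CREATE OR REPLACE STAGE access_exports URL = 's3://my-bucket/access-exports/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- Load the data into an existing Snowflake table
COPY INTO my_table FROM @access_exports/customers.csv FILE_FORMAT = (FORMAT_NAME = 'access_csv');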
You should be able to connect to Snowflake in Microsoft Access through an ODBC connection. You first need to install the Snowflake ODBC Driver and configure a DSN.
I have a .GDB database (an old one) and the data in it is very important.
I need to convert that .gdb database to a SQL Server database - can anyone help me...
Create connections to both the source GDB and the destination SQL Server in ArcCatalog. Copy everything from the source and paste it into the destination. You won't be able to do it with SQL tools alone.
Lacking ESRI software, for simple cases my workflow is to use the GDAL C++ API to read the GDB; this requires the GDAL File GDB driver. I then use Microsoft.SqlServer.Types to transfer the data to SQL Server. This involves low-level APIs, and you need to understand the spatial types in the respective libraries. It gets complex if you have polygons with rings, for example.
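As a simpler command-line alternative to the C++ API, GDAL's bundled ogr2ogr can often do the same transfer in one step - a sketch, assuming a GDAL build that includes the OpenFileGDB (or FileGDB) and MSSQLSpatial drivers, with placeholder server, database, and path names:

REM Read the file geodatabase and write its feature classes into SQL Server spatial tables
ogr2ogr -f MSSQLSpatial "MSSQL:server=localhost;database=GisDb;trusted_connection=yes" C:\data\old.gdb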
I'm not aware of a tool that will automatically convert between these database types. You'll need to use an application that can read the old database type (Firebird), learn the table design, create a similar table design in SQL Server, and use the application to load the data from Firebird into SQL Server.
Typically, this kind of work is called ETL (Extract/Transform/Load) and is done with migration tools like SQL Server Integration Services (SSIS). SSIS is free with SQL Server, and there are a lot of books available on how to use it - but like learning to develop software, this isn't a small task.
The easiest way to export Esri File Geodatabase (FGDB, .gdb) data to MS SQL Server is with ArcGIS for Desktop at the Standard or Advanced level.
You may also want to try exporting to shapefile (SHP) format (an open transitional format) and then importing into your MS SQL Server. One tool I've found online that has worked for me is Shape2SQL.
Esri also has an open File Geodatabase API that you can use to write your own tool.
I highly recommend FME Workbench for GIS data conversion. It's like SQL Server Integration Services (ETL) but for GIS: graphical interface, connect data readers to data writers, insert transformers, run them, etc.
I have been attempting to move from a regular SQL Server on a Win2008 Server to SQL Server on Amazon AWS RDS.
I thought a simple backup and restore would work, but AWS RDS doesn't seem to give access to a file system, and the SQL scripts all seem to need a local file system on the source and destination servers. I attempted a script along these lines:
exec sp_addlinkedserver @server='test.xxxx.us-east-1.rds.amazonaws.com'
-- Verify that the servers were linked (lists linked servers)
exec sp_linkedservers
-- Run the restore on the remote (RDS) server via the linked server
EXEC ('RESTORE DATABASE [orchard] FROM DISK = ''C:\Temp\orchard.bak'' WITH FILE = 1, NOUNLOAD, STATS = 10')
AT [test.xxxx.us-east-1.rds.amazonaws.com]
Any Suggestions would be helpful.
Download the free 'SQL Azure Migration Wizard' from CodePlex -- I did a short blog/screencast about this. Be sure to set the 'TO' setting in the wizard to the AWS DNS name, and then use 'SQL Server 2008' and not 'SQL Azure'.
The official word I got from AWS support on migrating SQL databases using .bak files is that it is not supported, so no quick restore from .bak files. They offered the official help for migrating existing databases here:
Official AWS database migration guide
They also gave me an unofficial wink at the Azure database migration tool. Just use it to generate a script of your schema and/or data and execute it against your RDS instance. It's a good tool. You will have to import the .bak into a non-RDS SQL Server first to do this.
SQL Azure migration tool
You will probably find that the Data-tier Applications BACPAC format will provide you with the most convenient solution. You can use Export to produce a file that contains both the database schema and data. Import will create a new database that is populated with data based on that file.
In contrast to the Backup and Restore operations, Export and Import do not require access to the database server's file system.
You can work with BACPAC files using SQL Server Management Studio or via the API in .NET, PowerShell, MSBuild, etc.
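For example, a rough SqlPackage.exe sketch of the Export/Import round trip (server, database, and file names are placeholders, and the password is elided):

REM Export schema and data from the source server into a BACPAC file
SqlPackage /Action:Export /SourceServerName:"SourceServer" /SourceDatabaseName:"orchard" /TargetFile:"C:\Temp\orchard.bacpac"

REM Import the BACPAC into the RDS instance as a new database
SqlPackage /Action:Import /SourceFile:"C:\Temp\orchard.bacpac" /TargetServerName:"test.xxxx.us-east-1.rds.amazonaws.com" /TargetDatabaseName:"orchard" /TargetUser:"admin" /TargetPassword:"..."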
Note that there are issues using this method to Export and then Import from and to Amazon RDS. When a new database is created on RDS, the following two objects are created within it:
A user with membership in the db_owner role.
The rds_deny_backups_trigger trigger.
During the import, there will be a conflict because these objects are both present in the BACPAC file and automatically created by RDS as the new database is created.
If you have a non-RDS instance of SQL Server handy, then you can Import the BACPAC to that instance, drop the objects above and then export the database to create a new BACPAC file. This one will not have any conflicts when you restore it to an RDS instance.
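On that intermediate instance, dropping the two offending objects looks roughly like this (the user name is whatever RDS created for you; [rds_master_user] here is just a placeholder):

-- Remove the database-level DDL trigger added by RDS
DROP TRIGGER [rds_deny_backups_trigger] ON DATABASE;

-- Remove the RDS-created user from db_owner, then drop it
ALTER ROLE db_owner DROP MEMBER [rds_master_user];
DROP USER [rds_master_user];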
Otherwise, it is possible to work around this issue using the following steps.
Edit the model.xml file within the BACPAC file (BACPACs are just zip files).
Remove the elements that have the following values in their Type attributes and that relate to the objects listed above (those that are automatically added by RDS):
SqlRoleMembership
SqlPermissionStatement
SqlLogin
SqlUser
SqlDatabaseDdlTrigger
Generate a checksum for the modified version of the model.xml file using one of the ComputeHash methods on the SHA256 class.
Use the BitConverter.ToString() method to convert the hash to a hexadecimal string (you will need to remove the separators) - see the PowerShell sketch after these steps.
Replace the existing hash in the Checksum element in the origin.xml file (also contained within the BACPAC file) with the new one.
Create a new BACPAC file by zipping the contents of the original with both the model.xml and origin.xml files replaced with the new versions. Do NOT use System.IO.Compression.ZipFile for this purpose as there seems to be some conflict with the zip file that is produced - the data is not included in the import. I used 7Zip without any problems.
Import the new BACPAC file and you should not have any conflicts with the objects that are automatically generated by RDS.
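A minimal PowerShell sketch of the checksum steps above (the extraction path is a placeholder; paste the resulting string into the Checksum element of origin.xml):

# Hash the edited model.xml exactly as it will appear inside the new BACPAC
$bytes = [System.IO.File]::ReadAllBytes("C:\bacpac-extract\model.xml")
$hash  = [System.Security.Cryptography.SHA256]::Create().ComputeHash($bytes)

# BitConverter emits "AB-CD-..."; strip the separators to match the Checksum format
[System.BitConverter]::ToString($hash) -replace '-', ''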
Note: There is another, related problem with importing a BACPAC to RDS using SQL Server Management Studio, which I explain here.
I wrote up some step-by-step instructions on how to restore a .bak file to RDS using the SQL Azure Migration Tool based on Lynn's screencast. This is a much simpler method than the official instructions, and it worked well for several databases I migrated.
Use the export wizard in SQL Server Management Studio on your source database. Right-click on the database > Tasks > Export Data. There is a wizard that walks you through sending the whole database to a remote SQL Server.
There is a tool designed by AWS that will answer most, if not all, of your compatibility questions - the Schema Conversion Tool for SQL Server: https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_Source.SQLServer.html
Because not all SQL Server database objects are supported by RDS, and support even varies across SQL Server versions, the Assessment Report will be well worth your time as well:
https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_AssessmentReport.html
Lastly, definitely leverage Database Migration Service:
https://aws.amazon.com/dms/
The following article, discussing how to Copy Database With Data – Generate T-SQL For Inserting Data From One Table to Another Table, is what I needed.
http://blog.sqlauthority.com/2009/07/29/sql-server-2008-copy-database-with-data-generate-t-sql-for-inserting-data-from-one-table-to-another-table/
I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev work, make changes, etc. on the local copy. I have some constraints though: the source db is part of a live e-commerce site in shared hosting, so I can't detach it, and the hosting service wants me to pay $5 for each ad hoc backup I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import into my local one. I've tried the SSMS 2008 Copy Database Wizard, but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked, but when I went to import using SQLCMD (the script was 1 GB, so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
If they don't charge you to run queries against the database, these tools may help. Granted, these are not free tools, but they are handy on so many fronts that it would be worth buying one. They can diff your source and target databases (data and structure, or just one or the other) and optionally sync the target database to match the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table, or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.
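When the generated script is too large for SSMS to open, sqlcmd (or osql) can run it against the local Express instance from the command line - a sketch with placeholder instance, database, and file names:

REM Run the generated script against the local Express instance, logging output to a file (-E uses Windows authentication)
sqlcmd -S .\SQLEXPRESS -d MyLocalCopy -E -i C:\dumps\livedb_script.sql -o C:\dumps\import_log.txt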
I'm trying to find a more elegant, general solution to our common, basic data export tasks.
I am convinced that there must be software out there that allows me to:
Define and persist a "setup" (definition of file format, delimiters, encoding, column names, etc.) from a GUI
Run on a schedule or from the command line
Work with both Oracle and MSSQL
However, I haven't found it yet... any tips?
What about using Groovy to export the data to XML files?
See http://groovy.codehaus.org/Convert+SQL+Result+To+XML
For the GUI part, Oracle's SQL Developer can connect to Oracle, MySQL, SQL Server, MS Access, and Sybase.
The search ended with using products from http://sqlmanager.net.
They have products that cover the described needs, except that there are separate products for MSSQL and Oracle, for example.