I am trying to move data from a local SQL Server 2014 database to Bluemix's database as a service. The IBM SQL database console limits uploads to 20 MB. Some of my tables are significantly larger than that.
What would be the best way to move the data?
Thanks
If you install the DB2 client on your local machine, you can connect directly to your SQLDB database and use the IMPORT or LOAD** utilities to load the data.
For example,
load client from /tmp/mssql.cust.csv of del replace into my_table
** With LOAD, you have to use LOAD CLIENT ... in order to load data from your client machine instead of the server.
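For a SQL Server 2014 source, the end-to-end flow might look roughly like this (a sketch only; the table names, paths and credentials are placeholders, and the remote SQLDB database has to be catalogued on the DB2 client first):

# 1) export the SQL Server table to a comma-delimited file
bcp mydb.dbo.cust out /tmp/mssql.cust.csv -S localhost -T -c -t,
# 2) from the DB2 client, connect to SQLDB and load the file
db2 connect to SQLDB user <userid> using <password>
db2 "load client from /tmp/mssql.cust.csv of del replace into my_table"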
Related
Is there a way to load a PostgreSQL DB dump file into a remote SQL Server database? E.g., is there a way bcp could do this, or anything else in the mssql-tools package?
I have a database in a PostgreSQL DBMS for which I periodically create a dump file like...
sudo -u postgres pg_dump --format=custom -d mydb > mydb.dump
...and can move it onto an NFS or FTP mount that is also mounted on a centralized data movement server (basically I run an Airflow instance here for managing scheduling). From there I can connect to another server hosting a SQL Server instance. I would like to import this dump file into a DB on that SQL Server remotely (i.e. running the code and scheduling from the data movement server). Normally I write large tables to this SQL Server instance via the bcp utility from mssql-tools, but I'm not sure how to handle this dump file. I have access credentials to the MSSQL DB, but can't install any new applications, code, etc. on the server.
I rarely deal with DB stuff. Does anyone know if this is possible (or if there is a better / more conventional way to do this kind of thing)?
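bcp can't read a pg_dump custom-format archive directly, so one common workaround (a sketch only, not a tested recipe; hosts, table names, paths and credentials below are placeholders) is to export each table to a delimited file with psql and then push it to SQL Server with bcp from the data movement server:

# export one table from PostgreSQL to CSV (assumes the movement server can reach the PostgreSQL host)
psql -h pg-host -U postgres -d mydb -c "\copy myschema.mytable TO '/mnt/share/mytable.csv' CSV"
# bulk-load the CSV into the remote SQL Server with bcp from mssql-tools
bcp TargetDB.dbo.mytable in /mnt/share/mytable.csv -S mssql-host -U myuser -P mypassword -c -t,

Quoting and embedded delimiters need care with -c and -t, so a less common delimiter (or a format file) may be safer for messy text columns.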
I have connection strings that look like this (after I added a few line feeds so they're easier to read):
<connectionStrings>
<add name="DefaultConnection"
connectionString="Data Source=xxx.database.windows.net;
Initial Catalog=database2;
Persist Security Info=True;
User ID=xxx;
Password=yyy"
providerName="System.Data.SqlClient"/>
</connectionStrings>
I have two tables, one in a local database and one in database2. The tables are in the dbo schema.
If the tables in the remote and local databases are both called USERDATA, how can I move data from my local database to the remote one? I assume I need to make a remote connection, but is that possible if the database is stored in the cloud like this? If possible, can someone tell me how I can set up this remote connection? I have SQL Server Management Studio, so I can open a SQL query window. I am just not sure how to specify the remote connection, and whether that is the best way to do it.
You can use a query like this:
INSERT INTO [database2].[dbo].[USERDATA]
SELECT tn.ID, tn.NAME
FROM [database].[dbo].[USERDATA] as tn
Or you can use SSDT (SQL Server Data Tools) to migrate your schema and data to SQL Database on Azure.
There are multiple ways of handling this. Two simple ones off the top of my head are:
1) If you just need to transfer data for ONE table, your best bet is a program called 'bcp' that comes with your SQL Server tools (a rough sketch follows below).
bcp allows you to export or import data (using the in or out parameters) to or from a file, and very quickly. You can simply use the bcp tool to export a table into a flat file, copy it over to the SQL Server in the cloud (using remote desktop, sharing the file in Azure storage, or any other web-based storage) and then import the table again.
2) Alternatively, if you have access to the SQL Server in the cloud from your on-premise machine (i.e. the firewall is open) and you're able to run SSIS, you can connect both SQL Servers within your SSIS package and transfer the data via SSIS.
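For option 1, a bcp round trip might look roughly like this (a sketch only; the server, database and credential values are echoed from the connection string above and are placeholders, not tested commands):

rem export the table from the local SQL Server to a flat file
bcp database.dbo.USERDATA out userdata.dat -S localhost -T -n
rem import the file into the Azure SQL database
bcp database2.dbo.USERDATA in userdata.dat -S xxx.database.windows.net -U xxx -P yyy -n

Native format (-n) avoids delimiter headaches when both ends are SQL Server; use -c with -t if you need a human-readable file instead.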
I am using SQL CE 3.5 for one of my projects; the front end is a WPF application which processes the given files and dumps the data into a SQL CE database.
Presently the application and the DB are on the same machine.
The client wants to be able to run the application from any machine on the network, but the database should remain on a shared location on the server.
The user will select the path to the SDF file in the application, and then whenever a file is processed the application will dump the data into the database.
My question is: if I keep the SDF file on a network shared location and access it from any machine, will it work fine, or could it cause problems?
Actually, it is not possible: SQL CE does not support network-hosted operation. Everything related to the SDF file (temp data) is written on the local machine, not on the network source, so the server cannot process requests correctly.
You can use SQL Server Express as the central data store; on the local client, the only thing you need is the Microsoft Sync Framework 2.1 library (it also works with SQL CE 4.0 SP1).
Summarising: create the SQL CE database, fill it with tables, index them, then port it to SQL Server Express, add the sync module to your app (in a separate thread, of course) and that's it.
Another solution is to use an MS Access DB, which does allow this kind of setup, but it is incredibly slow, not to mention there is no way to allow simultaneous writes to the DB.
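For illustration only (the names and paths below are made up, and this is just a sketch of the client-side configuration, not part of the answer above): the SQL CE connection string always points at a local or UNC .sdf file, while each networked client would talk to the central SQL Server Express instance through an ordinary SqlClient connection string and let the sync layer move data between the two.

<connectionStrings>
  <!-- local SQL CE cache on each client machine -->
  <add name="LocalCache"
       connectionString="Data Source=C:\App\local.sdf"
       providerName="System.Data.SqlServerCe.3.5"/>
  <!-- central SQL Server Express instance on the shared server -->
  <add name="CentralStore"
       connectionString="Data Source=SERVER\SQLEXPRESS;Initial Catalog=AppDb;Integrated Security=True"
       providerName="System.Data.SqlClient"/>
</connectionStrings>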
I am in the process of moving all our SharePoint DB's from a SQL 2005 server to a new 2008 server, and after moving the config database, everything seems ok, except when I click on "Timer Job Status" (under Central Admin > Operations > Global Configuration) I receive a "Unable to connect to database. Check database connection information and make sure the database server is running." error.
I get the following entries in the log regarding this:
12/03/2010 13:51:41.80  w3wp.exe (0x09E0)  0x09AC  Windows SharePoint Services  General  8e2r  Medium  Possible mismatch between the reported error with code = 0x8107053b and message: "Unable to connect to database. Check database connection information and make sure the database server is running." and the returned error with code 0x81020024.

12/03/2010 13:51:45.61  OWSTIMER.EXE (0x0744)  0x0DD8  Windows SharePoint Services  Database  6f8e  Critical  SQL Database 'SP_Test_Config' on SQL Server instance 'test-server' not found. Additional error information from SQL Server is included below. Cannot open database "SP_Test_Config" requested by the login. The login failed.
It should be noted that in order to ensure that it was no longer using the config database on the old server, I detached the original SP_Test_Config database in SQL Management Studio.
Obviously there are still references to the old SP_Test_Config database on the old 2005 server. How do I remove these references? Or, barring that, how do I move the config database in such a way that no references to the old 2005 server will remain?
Thank you in advance!
Not really an answer, but what we ended up doing (basically start from scratch using SQL Aliases):
First, create the SQL Server alias. This will make it so if you need to move the databases again in the future, you can just migrate all the databases to the new SQL Server, and change your SQL Server alias to point at that server. This should save you a lot of trouble and heartache in the future.
Run the SQL Server client configuration utility (cliconfg.exe) at C:\Windows\System32\cliconfg.exe.
Under the Alias tab, create a SQL Server Alias for the new SQL Server.
Now, recreate the farm.
1) Run stsadm -o preparetomove on all content DBs, back up all content DBs, and copy them to the new SQL Server (rough command sketches follow this list)
2) Remove all servers from the farm using the SharePoint Configuration Wizard
3) Recreate the farm using the SharePoint Configuration Wizard, pointing it at the SQL Server alias you created above
4) Recreate all web apps with temporary content DBs
5) Run stsadm -o deletecontentdb on all temporary content databases created in step 4
6) Run stsadm -o addcontentdb using the copied production databases as content databases
7) Troubleshoot ad nauseam
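The stsadm calls above look roughly like this (a sketch only; the web app URL, database names and the OLDSQL/SQLALIAS server names are placeholders):

stsadm -o preparetomove -contentdb OLDSQL:WSS_Content_Prod
stsadm -o deletecontentdb -url http://webapp -databasename WSS_Content_Temp
stsadm -o addcontentdb -url http://webapp -databasename WSS_Content_Prod -databaseserver SQLALIAS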
I am working on a database application that runs on various independent servers.
Each server runs an instance of SQL Server 2005 with the same database. We would have a Master server that would be the definitive source of information, and various "client" servers distributed around (with no network connection of any kind). These client servers would return from time to time (let's say once a week) to be synchronized with the Master. Simply put, the process would be:
1) Update the database on the master server with all the modifications from a client server (taking care not to overwrite changes made by the update process of a different client server that updates the same master server)
2) Copy an updated version of the master server database to the client server.
Thanks for any help
MS SQL Integration Services may help:
http://www.microsoft.com/sql/technologies/integration/default.mspx
Also look into database replication; check the Master-Remote part too.
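To make step 1 of the question concrete, here is a minimal T-SQL sketch (not part of the original answer; the Orders table, its LastModified column and the staging schema are all assumed for illustration) of applying a client's rows to the master copy without clobbering newer changes made by other clients:

-- rows the client changed more recently win; everything else is left alone
UPDATE m
SET    m.Amount       = s.Amount,
       m.LastModified = s.LastModified
FROM   dbo.Orders AS m
JOIN   staging.Orders AS s ON s.OrderID = m.OrderID
WHERE  s.LastModified > m.LastModified;

-- rows that exist only on the client are inserted into the master copy
INSERT INTO dbo.Orders (OrderID, Amount, LastModified)
SELECT s.OrderID, s.Amount, s.LastModified
FROM   staging.Orders AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Orders AS m WHERE m.OrderID = s.OrderID);

SQL Server merge replication automates this kind of two-way synchronization (including conflict tracking), which is the feature worth reading up on for the disconnected, sync-once-a-week scenario described above.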