I'm new to PostgreSQL. We have weekly data dumps which are stored with a .dump extension, and I want to import one of these so I can interact with the data locally on my machine. I tried pgAdmin, but that apparently only imports .json files, and then I tried a free trial of TablePlus, but I still can't figure it out there. Does anyone know what approach I should use to achieve this?
Many thanks!
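For anyone landing on this later, here is a minimal sketch, assuming the file was produced by pg_dump in its custom format (the database and file names below are placeholders to adapt):

```shell
# Create a fresh local database to restore into
# (database and file names here are placeholders)
createdb mydb_local

# Custom-format .dump archives are restored with pg_restore;
# --no-owner avoids errors when the original roles don't exist locally
pg_restore --no-owner --dbname=mydb_local weekly.dump

# If the file turns out to be a plain-SQL dump, use psql instead:
# psql --dbname=mydb_local --file=weekly.dump
```

A quick way to tell the formats apart: a plain-SQL dump opens as readable text, while a custom-format archive starts with the bytes `PGDMP`.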
I've built a database using Oracle SQL Developer for a university exam. Now I have to send it to my professor by email.
Is it possible to export the database completely as one directory, containing everything I've created, so that my professor can import it into SQL Developer and test it?
I've tried "Tools" > "Database Export", but I got only a lot of separate files.
Wrong, I'm afraid: you've (tried to) export just one schema, not the entire database.
Anyway, if you got a bunch of files, it's because you instructed the tool to produce them. Next time, choose the single-file option.
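If the command line is acceptable, Oracle Data Pump can also export a whole schema into a single dump file. A rough sketch, where the credentials, connect string, schema name, and directory object are all placeholders to verify against your environment:

```shell
# Export one schema into a single dump file on the database server
# (DATA_PUMP_DIR must exist as an Oracle directory object)
expdp scott/tiger@orcl schemas=SCOTT directory=DATA_PUMP_DIR \
    dumpfile=scott.dmp logfile=scott_exp.log

# The recipient loads it with the matching import tool:
# impdp user/pass@orcl schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp
```

Note that the dump file lands on the server, not the client machine, so this only helps if you can reach the server's filesystem.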
I have hundreds of .dbf files separated into different folders, basically organized as a bunch of different databases with the same structure. Some extend into the millions of rows. I need that data to end up in a SQL Server database (or multiple databases?), so my team can work with it more easily.
I can't do it manually via Access or the Import/Export Wizard as most tutorials suggest, because there are too many files. I have Access 2016, SSMS 17, Visual Studio 2017, and Windows 10 64 bit. I have been able to open individual tables in Access through the dBase V setting.
I don't know where to start, because I'm pretty new to everything. For example, should I write a console app, configure some SSMS setting, or do something else I'm unaware of?
Could you outline a high level step-by-step process I should use, and maybe point me to some resources? I've looked at a bunch of docs and forums through Google, but none quite seem to make sense to me. The most promising is this post, but I don't have the provider listed, like several others in the comments.
You may want to consider Integration Services (SSIS) for this kind of import job:
https://learn.microsoft.com/en-us/sql/integration-services/connection-manager/integration-services-ssis-connections
And here is how to Connect to a dBASE or Other DBF File
https://learn.microsoft.com/en-us/sql/integration-services/connection-manager/connect-to-a-dbase-or-other-dbf-file
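As a rough illustration of what that article describes (the folder path below is a placeholder), the OLE DB connection string points at the folder containing the .dbf files, and each file then shows up as a table:

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\dbf_folder;Extended Properties=dBASE IV;
```

One caveat worth knowing: the Jet provider is 32-bit only, which may be why it doesn't appear in the provider list in 64-bit tools; the ACE provider (Microsoft.ACE.OLEDB.12.0) is the commonly suggested alternative on 64-bit systems.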
If this is a one-time operation, you can convert the .dbf files to SQL scripts and import the data into SQL Server. There are some applications on the Windows Store capable of doing so:
https://www.microsoft.com/en-us/store/search/apps?q=dbf+to+sql
SSMA for Access to Console (AccessToSQL)
I created a project with SSMA for Access and successfully imported an Access file into SQL Server. Now I want to automate the task with the SSMA for Access Command Prompt. I also found the documentation to be quite lacking in information.
I think the problem is that it's trying to migrate both the Access tables and the queries. Is there any way to migrate only the tables?
(I should add that migrating only the tables with the SSMA GUI works just fine.)
Edit:
With the provided sample XML and some minor tweaks I'm able to migrate the tables, so I think my problem is solved.
SSMS import Data
I also tried to import the data using SQL Server Management Studio, but since it's an Express edition, I'm unable to save the package as described in this process.
After fiddling around for two days, I finally managed to make the SSMA for Access Command Prompt work by editing the sample script to fit my needs; you can see it here.
Now I'll schedule the task with the built-in Windows Task Scheduler, and that should do the trick.
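For the scheduling step, a sketch using schtasks; the paths, task name, schedule, and the SSMA console's `-s` script flag are all assumptions to check against your own install:

```shell
# Run the SSMA console script every Sunday at 02:00
# (executable and script paths are placeholders)
schtasks /Create /SC WEEKLY /D SUN /ST 02:00 /TN "SSMA weekly migration" \
    /TR "\"C:\SSMA\SSMAforAccessConsole.exe\" -s \"C:\scripts\migrate.xml\""
```

The same thing can of course be set up through the Task Scheduler GUI if you prefer clicking to typing.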
I have some questions about importing data from Excel/CSV File into SQL Server. Let me first explain the overall scenario.
We receive data from multiple sources in either Excel or CSV format. This data is to be imported into a table in SQL Server. Because we receive the data from multiple sources, we have a requirement to map the columns in the Excel files to the columns in our SQL Server table.
I understand that either DTS or the Import/Export Wizard is the way to import this data if we were to do it manually, but I have the following questions:
Are there alternatives available to DTS/Import export wizard?
If I were to write an application for importing data what are the .net framework classes that I would or could use? For some reason I don't want to use or build a SQL script within the application. What would be the best way of going about doing this?
Is there any way we can reduce the effort involved in mapping data?
Any suggestions or help would be most welcome.
Regards
Romi
Are there alternatives available to DTS/Import export wizard?
-- BULK INSERT.
If I were to write an application for importing data what are the .net framework classes that I would or could use? For some reason I don't want to use or build a SQL script within the application. What would be the best way of going about doing this?
-- SSIS.
Is there any way we can reduce the effort involved in mapping data?
-- ?
SSIS is a very powerful tool. You may want to explore that option first. You can even build custom components using .NET as well.
I want to import data into SQL Server Express from Access, Excel, and .txt files. I'm building a proper database, and I must import this old, differently formatted data. When working with only a few records, I copy and paste directly through the Visual Web Developer DB Explorer.
But now I'm dealing with a few more records (40k), and I think copy/paste is unsafe, slow, and unprofessional. I don't have any other interface to control SQL Server. How can I do this?
Thanks!
There is an "Import and Export Wizard" that comes with SQL Express. It allows you to import from Access, Excel, ODBC, SQL Client etc.
I don't think there's a clear answer, but I really think MS Access 2000 or higher is a very versatile tool for this.
Linking in the tables and using append queries into other linked tables works really well, and utilizing the power of VBA helps in some cases too, e.g. calling a VBA function such as InStr or Mid from the query designer, if you're familiar with that.
Does anyone else agree?
BCP (the bulk copy program) works well for importing into SQL Server: http://msdn.microsoft.com/en-us/library/ms162802.aspx
There is also the "bulk insert" command: http://msdn.microsoft.com/en-us/library/ms188365.aspx which has the caveat that the file must be physically accessible from the server.
Both of these methods can import comma delimited files, so you'd need to be able to create those from your data source.
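To make the two options concrete, hedged sketches of both follow; the server, database, table, and file names are placeholders, and note again that the BULK INSERT path is resolved on the server machine, not the client:

```shell
# bcp runs on the client: -c = character data, -t, = comma field terminator,
# -T = Windows integrated auth (use -U/-P for a SQL login instead)
bcp MyDb.dbo.MyTable in data.csv -S localhost -T -c -t,

# BULK INSERT runs inside the server process, so the file path
# must be reachable by the SQL Server service itself
sqlcmd -S localhost -E -Q "BULK INSERT dbo.MyTable FROM 'C:\data\data.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)"
```

FIRSTROW = 2 skips a header line; drop it if your file has no header.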
I recommend loading all the rows from one SQL table into a JSON array, then iterating over the array of objects and translating each one into the new table. I have some open-source MySQL-to-JavaScript bridge code that can help with this if you need it.
In case you have not found a solution to this yet, try http://www.razorsql.com/download_win.html
I'm not affiliated with them, but I was looking for the same solution and it is working for me.