We have a table with millions of records and need to archive records older than 3 years. What is the best way to archive old data from SQL Server database tables?
I guess it depends on the structure of your database and what you need to do with the archived records. Do they need to be accessible from your application? Do you just need them somewhere so that you can run ad-hoc queries against them in the future?
Options may include: creating an "Archive Database" where you move the older records and everything linked to them (foreign-key tables); creating an archive table in the same database; or something more complex like partitioned tables (SQL Server 2005 and later).
More Info:
Partitioning & Archiving tables in SQL Server (Part 1: The basics)
Partitioning & Archiving tables in SQL Server (Part 2: Split, Merge and Switch partitions)
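If the simpler archive-table route is enough, a minimal sketch (assuming a hypothetical dbo.Orders table with an OrderDate column and an identically structured dbo.Orders_Archive table) moves rows in batches so each transaction stays small:

-- Cutoff is an assumption: the question's "older than 3" is read as 3 years.
DECLARE @cutoff datetime = DATEADD(YEAR, -3, GETDATE());

WHILE 1 = 1
BEGIN
    -- Move up to 10,000 rows per batch into the archive table.
    -- Orders_Archive must have the same columns and no enabled triggers.
    DELETE TOP (10000) FROM dbo.Orders
    OUTPUT deleted.* INTO dbo.Orders_Archive
    WHERE OrderDate < @cutoff;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to archive
END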
I have a file that contains client information (client number, client name, client type, etc.) and I imported it into a SQL table.
Client information can change, and when it does, the file is updated.
What I want to do now is create an SSIS package that reads the file, checks for any differences between the file and the SQL table, and, if any changes are found, updates the table from the file (the file will always contain the latest information).
How would I achieve this? Is this possible from an SSIS perspective?
There are different options to achieve this.
Load the file into a staging table first, then MERGE it into the production table: rows that do not match are inserted, and rows that do match can update the production table accordingly (see the sketch after these options). More info: https://www.mssqltips.com/sqlservertip/1704/using-merge-in-sql-server-to-insert-update-and-delete-at-the-same-time/
Alternatively, load the data into a staging table and then use a Lookup transformation in SSIS. Find the link for the Lookup transformation here: https://www.red-gate.com/simple-talk/sql/ssis/implementing-lookup-logic-in-sql-server-integration-services/
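For the first option, a minimal MERGE sketch, assuming hypothetical dbo.Client and dbo.Client_Stage tables keyed on ClientNumber (the staging table is truncated and reloaded from the file by the SSIS package):

MERGE dbo.Client AS target
USING dbo.Client_Stage AS source
    ON target.ClientNumber = source.ClientNumber
WHEN MATCHED THEN
    -- Existing client: pull the latest values from the file.
    UPDATE SET target.ClientName = source.ClientName,
               target.ClientType = source.ClientType
WHEN NOT MATCHED BY TARGET THEN
    -- New client in the file: insert it.
    INSERT (ClientNumber, ClientName, ClientType)
    VALUES (source.ClientNumber, source.ClientName, source.ClientType);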
My project is a Data Historian System. It reads data (about 100,000 records) from sources every 5 seconds and inserts it into a database for reports and analysis. The format of the data is simple (INT, INT, FLOAT, DATETIME).
Should I use an OLAP database approach? Is SQL Server suitable for this case?
Thanks...
That sounds crazy inefficient: there are several alternative approaches you might want to consider:
Use an update trigger to write table inserts / changes to a history table. You should add the change date to the history table so that the "effective" record for any particular datetime can be determined (a sketch follows this list).
In SQL Server, a timestamp column can be used to drive record version identification, and you can use the same kind of polling approach you suggested, but saving only new / changed records.
SQL Server has Change Data Capture to identify changed rows: details here.
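A minimal sketch of the first option (trigger into a history table), assuming a hypothetical dbo.Readings table matching the question's INT, INT, FLOAT, DATETIME layout and a dbo.Readings_History table with the same columns plus a ChangeDate:

CREATE TRIGGER trg_Readings_History
ON dbo.Readings
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Stamp each inserted/changed row with the change date in the history table.
    INSERT INTO dbo.Readings_History (SourceId, TagId, Value, ReadingTime, ChangeDate)
    SELECT SourceId, TagId, Value, ReadingTime, GETDATE()
    FROM inserted;
END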
On an ad-hoc basis, we want to copy the contents of 4 of our Oracle production tables to QA/UAT environments.
This is not a direct copy; we need to copy data based on some input criteria for filtering.
Earlier we were using a Sybase database, so the BCP utility worked like a charm there. However, we have recently migrated to Oracle and have a similar data-copy requirement.
Based on my analysis so far, I have considered the options below:
RMAN (Recovery Manager) - Cannot use, as it does not allow us to copy selected tables or filter the data.
SQL*Loader (SQLLDR) - Cannot use, as we have BLOB columns and are not sure how to create a CSV file for these BLOBs. Any suggestions?
Oracle Data Pump (expdp/impdp) - Cannot use, as even though it allows copying selected tables, it does not let us filter data using a query with joins (I know it accepts a query, but that works only on a single table). A workaround is to create temp tables with the desired dataset and dump them using expdp and impdp. Any suggestions if I have missed anything in this approach?
Database link - This seems like the best approach for this use case, but we need to check whether the DBA will allow us to create links to/from the PRD database.
SQL*Plus COPY - Cannot use, as it does not work with BLOB fields.
Can someone please advise on which would be the best approach w.r.t. performance?
I would probably use a DATAPUMP-format external table, so it would be something like:
create table my_ext_tab
organization external
(
type oracle_datapump
default directory UNLOAD
location( 'my_ext_tab.dmp' )
)
as
<my query>
You can then copy the file across to your other database, create the external table there, and populate your new table via an insert, something like:
insert /*+ APPEND */ into my_tab
select * from my_ext_tab
You can also use parallelism to read and write the files.
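On the target side, the external table has to be created with an explicit column list, since there is no CTAS query to derive it from. A hedged sketch with made-up columns:

create table my_ext_tab
(
  id      number,
  name    varchar2(100),
  created date
)
organization external
(
  type oracle_datapump
  default directory UNLOAD
  location ( 'my_ext_tab.dmp' )
);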
Taking all your constraints into account, it looks like a database link is the best option. You can create views for your queries with joins and filters in the PROD environment and select from those views through the db link. That way, the filtering is done before the transfer over the network, not afterwards on the target side.
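A minimal sketch of the view-plus-link approach, with hypothetical names for the link, view, tables, and filter:

-- On PROD: a view that applies the joins and filters up front.
create view qa_extract_v as
select t1.id, t1.doc_blob, t2.name
from   big_table t1
join   lookup_table t2 on t2.id = t1.lookup_id
where  t1.created_date > date '2020-01-01';

-- On QA/UAT: a link to PROD, then pull the filtered rows across.
-- (INSERT ... SELECT can pull BLOBs through a link, although some other
-- LOB operations over database links are restricted.)
create database link prod_link
  connect to extract_user identified by extract_password
  using 'PROD';

insert into local_copy
select * from qa_extract_v@prod_link;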
We have 52 MS Access databases, and each database has 4 tables. The total data across our databases is around 5 million records. We are now planning to move to SQL Server. We have designed our new database, which will be a SQL Server database with approximately 60 tables.
My question is: how will we integrate the 52 Access databases into one SQL Server database?
Is that possible, or would we have to create 52 databases in SQL Server too in order to migrate our data? These 52 databases are interrelated and have the same structure in Access.
If I was you (and I'm not, but if I was...) I would load all of that data into 4 tables. Just append the data from each Access database into one set of tables: Doctor, Project, Contract, Institution. However, as I append each database, I would add a new field to each table: Country. Then, when you append the data for England to the tables, you also populate the Country field with "England", and so on for all your countries.
Now, when it comes time to access the data, you can force certain users to only be able to see the data for England, and certain other people to only see the data for Spain, etc... This way, those 4 tables can house all of your data, and you can still filter by any country you like.
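A hedged sketch of one such append, assuming the ACE OLE DB provider is installed, ad hoc distributed queries are enabled, and using made-up file paths and column names:

-- Append one country's Doctor table, tagging the rows with their country.
INSERT INTO dbo.Doctor (DoctorId, DoctorName, Country)
SELECT DoctorId, DoctorName, 'England'
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'C:\data\england.accdb'; 'Admin'; '',
                'SELECT DoctorId, DoctorName FROM Doctor');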
From a technical point of view, there's no problem in creating only one SQL Server database, containing all 52 * 4 tables from the MS Access databases. SQL Server provides various options for logically separating your objects, for example by using Schemas. But then again, if you decide to create separate databases, you still have the ability to write queries across databases, even if the databases are not hosted on the same SQL Server instance (although there might be a performance penalty when writing queries across linked servers).
It's difficult to give a more precise answer with the limited detail in your question, but in most cases a single database with multiple schemas (perhaps one schema for each MS Access database) would probably be the best solution.
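A minimal sketch of the schema-per-source idea, with hypothetical names:

-- One schema per source Access database, each holding the same 4 tables.
CREATE SCHEMA england;
GO
CREATE TABLE england.Doctor
(
    DoctorId   int PRIMARY KEY,
    DoctorName nvarchar(100)
);
GO
SELECT * FROM england.Doctor;  -- queries stay unambiguous across sources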
Now, for migrating the data from MS Access to SQL Server, you have various options. If you just need to perform a one-time migration, you can simply use the Import/Export Wizard that comes with SQL Server. The wizard automatically creates the tables in the destination database for you, and it also lets you save SSIS packages that you can use to migrate the data again.
I have two SQL Server databases, both with the same database and table structure. I want to insert data from one specific table into the corresponding table in the other database. Both tables have the same structure, but the user for each database is different.
I tried this query, but it does not work:
insert into Database1.dbo.Audit_Assessment1
select * from Database2.dbo.Audit_Assessment1
Please help me
SQL Server Management Studio's "Import Data" task (right-click on the DB name, then Tasks) will do most of this for you. Run it from the database you want to copy the data into.
If the tables don't exist, it will create them for you, but you'll probably have to recreate any indexes and such. If the tables do exist, it will append the new data by default, but you can adjust that (edit mappings) so it deletes all existing data first.
Pulled from https://stackoverflow.com/a/187852/435559
1. You can use a linked server: in SSMS, set one up under Server Objects > Linked Servers, then open a new query window and write your query (see the sketch after the links below).
2. You can use replication:
Snapshot replication if this is a one-time or occasional copy.
Transactional replication if the insert happens repeatedly.
Read more about replication :
http://msdn.microsoft.com/en-us/library/gg163302.aspx
Read more about linked servers :
http://msdn.microsoft.com/en-us/library/aa560998.aspx
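A minimal sketch of the linked-server option, run on the server hosting Database1, with a hypothetical remote server name:

-- Register the remote server that hosts Database2.
EXEC master.dbo.sp_addlinkedserver
    @server = N'REMOTESRV',
    @srvproduct = N'SQL Server';

-- Map logins if the remote credentials differ (see sp_addlinkedsrvlogin).
-- Then use the four-part name: server.database.schema.table
INSERT INTO Database1.dbo.Audit_Assessment1
SELECT * FROM REMOTESRV.Database2.dbo.Audit_Assessment1;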
Try approaching this differently. Why not script out the table you need and manipulate the data that way?
You should be able to easily modify the scripted-out INSERT statements to point at your new database.
Sounds like your login doesn't have INSERT permission on Database2.dbo.Audit_Assessment1. The error about it being an invalid object name is probably because you don't currently have VIEW DEFINITION permission either.
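If that is the cause, a minimal sketch of the grants a DBA could apply (the user name is a made-up placeholder):

-- Run in the database that raises the error, for the affected user.
GRANT SELECT          ON dbo.Audit_Assessment1 TO [app_user];
GRANT INSERT          ON dbo.Audit_Assessment1 TO [app_user];
GRANT VIEW DEFINITION ON dbo.Audit_Assessment1 TO [app_user];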