generate script to create only a single row - sql-server

I was referencing this question How to export all data from table to an insertable sql format?
while looking for a way to create an INSERT statement for a single row from a table without having to write it manually, since the table has many columns. In my case I simply followed the steps listed, performed a Ctrl-F search in the resulting script for the record I wanted, then copied and pasted that single line into another query window, but this would be terrible if I had hundreds of millions of rows. Is there a way to get the same functionality but tell the script generator I only want rows where id = value? Is there a better way to do this using only the out-of-the-box Microsoft tools?

There is no built-in way to filter the generated script, but you can work around it with a temp table:
Create a new table with INSERT INTO ... SELECT, selecting only the records you want to script.
Generate the script, then change the table name back using find and replace.
Finally, drop the temporary table.
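A minimal sketch of that workaround in T-SQL, assuming a table dbo.MyTable with an id column (the names and the id value are placeholders, not from the original question):

-- Copy only the row you want into a helper table.
SELECT *
INTO dbo.MyTable_single_row
FROM dbo.MyTable
WHERE id = 42;

-- Run the Generate Scripts wizard against dbo.MyTable_single_row
-- (Tasks > Generate Scripts, Advanced > Types of data to script = Data only),
-- then find-and-replace the helper table name with the real one.

-- Clean up when done.
DROP TABLE dbo.MyTable_single_row;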


Azure Synapse pipeline: how to add a GUID to raw data

I am new to Azure Synapse and am technically a Data Scientist who's doing a Data Engineering task. Please help!
I have some xlsx files containing raw data that I need to import into a SQL database table. The issue is that the raw data does not have a uniqueidentifier column, and I need to add one before inserting the data into my SQL database.
I have been able to successfully add all the rows to the table by adding a new column in the Copy data activity and setting it to @guid(). However, this sets the GUID of every row to the same value (not unique for each row).
GUID mapping: (screenshot)
DB result: (screenshot)
If I do not add this mapping, the pipeline throws an error stating that it cannot import a NULL Id into the column Id. Which makes sense as this column does not accept NULL values.
Is there a way to have Azure Synapse Analytics read in a raw xlsx file and then import it into my DB with a unique identifier for each row? If so, how can I accomplish this?
Many many thanks for any support.
Giving dynamic content to a column in this way would generate the same value for the entire column.
Instead, you can generate a new GUID for each row using a ForEach activity.
You can retrieve the data from your source Excel file using a Lookup activity (my source only has a name column). Give the output array of the Lookup activity to the ForEach activity:
@activity('Lookup1').output.value
Inside the ForEach activity, since you already have a linked service, create a Script activity. In this Script activity, you can create a query with dynamic content to insert values into the destination table. The following is the query I built using dynamic content:
insert into demo values ('@{guid()}','@{item().name}')
This allows you to iterate through the source rows and insert each row individually while generating a new GUID every time.
You can follow the above procedure to build a query that inserts each row with a unique identifier value. The following is an image where I used Copy data to insert the first 2 rows (same as yours) and inserted the next 2 rows using the above procedure.
NOTE: I have used an Azure SQL Database for the demo, but that does not affect the procedure.
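For context, the destination table assumed in this example could look something like the following sketch (column names and types are assumptions, not taken from the original post):

CREATE TABLE dbo.demo
(
    Id   uniqueidentifier NOT NULL PRIMARY KEY,  -- populated per row by @{guid()}
    name nvarchar(100)    NULL                   -- populated by @{item().name}
);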

Updating Column Name of a Table together with its dependent UDFs/USPs

I don't know if this already has an answer as I wasn't able to find it, but if so please provide the link.
Column names are used in user-defined stored procedures and functions when accessing certain columns in a table. But what if I want to edit a column name in a table, and I have referenced that column name many times in multiple USPs/UDFs?
I am currently using SQL Server, and obviously it will not let me update a table column since it is called in multiple existing USPs/UDFs. Is there a way to update the column name of a table together with the column names in those USPs/UDFs?
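A minimal sketch of one common approach, assuming sp_rename and the sys.sql_expression_dependencies catalog view are available (all object and column names below are hypothetical, not from the question):

-- Rename the column itself (this alone does not touch the modules).
EXEC sp_rename 'dbo.MyTable.OldColumnName', 'NewColumnName', 'COLUMN';

-- Find the stored procedures/functions that still reference the old column name,
-- so each one can be updated with ALTER PROCEDURE / ALTER FUNCTION.
SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS schema_name,
       OBJECT_NAME(d.referencing_id)        AS module_name
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_entity_name = 'MyTable'
  AND d.referenced_minor_name  = 'OldColumnName';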

How to use the pre-copy script from the copy activity to remove records in the sink based on the change tracking table from the source?

I am trying to use change tracking to copy data incrementally from a SQL Server to an Azure SQL Database. I followed the tutorial on Microsoft Azure documentation but I ran into some problems when implementing this for a large number of tables.
In the source part of the copy activity I can use a query that gives me a change table of all the records that are updated, inserted or deleted since the last change tracking version. This table will look something like
PersonID  Age   Name   SYS_CHANGE_OPERATION
--------  ----  -----  --------------------
1         12    John   U
2         15    James  U
3         NULL  NULL   D
4         25    Jane   I
with PersonID being the primary key for this table.
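A source query of roughly this shape produces such a change table; this is only a sketch following the usual change tracking pattern, and the table name dbo.Person and the stored version value are assumptions:

-- The change tracking version saved at the end of the previous run.
DECLARE @last_sync_version bigint = 0;

SELECT CT.PersonID, p.Age, p.Name, CT.SYS_CHANGE_OPERATION
FROM CHANGETABLE(CHANGES dbo.Person, @last_sync_version) AS CT
LEFT JOIN dbo.Person AS p
    ON p.PersonID = CT.PersonID;   -- deleted rows have no match, hence the NULLs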
The problem is that the copy activity can only append the data to the Azure SQL Database so when a record gets updated it gives an error because of a duplicate primary key. I can deal with this problem by letting the copy activity use a stored procedure that merges the data into the table on the Azure SQL Database, but the problem is that I have a large number of tables.
I would like the pre-copy script to delete the deleted and updated records on the Azure SQL Database, but I can't figure out how to do this. Do I need to create separate stored procedures and corresponding table types for each table that I want to copy or is there a way for the pre-copy script to delete records based on the change tracking table?
You have to use a Lookup activity before the Copy activity. With that Lookup activity you can query the database so that you get the deleted and updated PersonIDs, preferably all in one field, separated by commas (so it's easier to use in the pre-copy script). More information here: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
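A sketch of what that Lookup query could look like, assuming change tracking is enabled on dbo.Person and STRING_AGG is available (SQL Server 2017+ / Azure SQL Database); the table name and version value are assumptions:

DECLARE @last_sync_version bigint = 0;  -- version stored from the previous run

SELECT STRING_AGG(CAST(CT.PersonID AS varchar(20)), ',') AS PersonIDs
FROM CHANGETABLE(CHANGES dbo.Person, @last_sync_version) AS CT
WHERE CT.SYS_CHANGE_OPERATION IN ('U', 'D');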
Then you can do the following in your pre-copy script:
delete from TableName where PersonID in (@{activity('MyLookUp').output.firstRow.PersonIDs})
This way you will be deleting all the deleted or updated rows before inserting the new ones.
Hope this helped!
In the meantime, Azure Data Factory provides the metadata-driven copy task. After going through the dialogue-driven setup, a metadata table is created, which has one row for each dataset to be synchronized. I solved this upsert problem by adding a stored procedure as well as a table type for each dataset to be synchronized. Then I added the relevant information to the metadata table for each row like this:
{
    "preCopyScript": null,
    "tableOption": "autoCreate",
    "storedProcedure": "schemaname.UPSERT_SHOP_SP",
    "tableType": "schemaname.TABLE_TYPE_SHOP",
    "tableTypeParameterName": "shops"
}
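The stored procedure and table type referenced above could look roughly like this sketch; only the object names and the parameter name come from the metadata entry, while the column list and key are assumptions made up for illustration:

CREATE TYPE schemaname.TABLE_TYPE_SHOP AS TABLE
(
    ShopId   int           NOT NULL,
    ShopName nvarchar(200) NULL
);
GO
CREATE PROCEDURE schemaname.UPSERT_SHOP_SP
    @shops schemaname.TABLE_TYPE_SHOP READONLY
AS
BEGIN
    MERGE schemaname.SHOP AS target
    USING @shops AS source
        ON target.ShopId = source.ShopId
    WHEN MATCHED THEN
        UPDATE SET ShopName = source.ShopName
    WHEN NOT MATCHED THEN
        INSERT (ShopId, ShopName) VALUES (source.ShopId, source.ShopName);
END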
After that you need to adapt the sink properties of the copy task like this (stored procedure, table type, table type parameter name):
@json(item().CopySinkSettings).storedProcedure
@json(item().CopySinkSettings).tableType
@json(item().CopySinkSettings).tableTypeParameterName
If the destination table does not exist, you need to run the whole task once before adding the above variables, because auto-create of tables works only as long as no stored procedure is given in the sink properties.

SQL Server + triggers: How to get information about the inserted/deleted tables

Is there any way to get information about the inserted/deleted tables in a trigger, meaning:
which columns are in the inserted/deleted tables?
which data types do they have?
The background for this question is: I would like to create a 'generic' trigger which can be used without needing to be adapted for the table in question.
Dummy code:
foreach column in inserted table
    get column name and data type (e.g. to exclude text columns or columns with a special name)
    if the value has changed, do some logging into another table
Unfortunately I could not find the needed information so far; I have to admit that I don't know the proper key words which have to be used.
It would even be helpful to get a list of functions (e.g. UPDATE()) which can be used in triggers to query information about the inserted/deleted tables.
The T-SQL code should work on MS SQL Server >= 2008.
TIA
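As a starting point, here is a minimal sketch of how a trigger can look up the columns and data types of its own base table through the catalog views (trigger and table names are hypothetical, and this is not a complete generic logging trigger):

CREATE TRIGGER dbo.trg_Demo_GenericLog
ON dbo.Demo
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- @@PROCID is the object id of the currently executing trigger;
    -- sys.triggers gives its parent table, sys.columns/sys.types the metadata.
    SELECT c.name      AS column_name,
           ty.name     AS data_type,
           c.max_length
    FROM sys.triggers AS tr
    JOIN sys.columns  AS c  ON c.object_id = tr.parent_id
    JOIN sys.types    AS ty ON ty.user_type_id = c.user_type_id
    WHERE tr.object_id = @@PROCID;

    -- COLUMNS_UPDATED() returns a bitmask (one bit per column_id) that can be
    -- combined with this metadata to decide which columns to log.
END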

Can I edit AutoNumber Column in Access?

I've lost my data in my Access database, and I've managed to bring it back, but when I copy the values into the table with the AutoNumber column it increments the numbers.
Is there any way to change it to int and then bring it back to AutoNumber?
Here is how I managed to do this in Access 2010:
Make a backup of your database. (Just to be safe.)
Right-click on the table in the tables list, and select Export->Excel. Accept all defaults.
Open the table in Excel and make the desired change to the autonumber field.
Open the table and delete all rows.
Right-click on the table in the tables list, and select Import->Excel.
In the options, choose "Append to table" and select the table. Accept defaults for all other options.
This might not be a viable solution for a large table. I don't think Excel can handle more than around 65K rows.
Don't copy the data with the user interface; append it with a query instead. Because an AutoNumber field is just a long integer with a special default value, you can append values that already exist to it. That doesn't work in the UI, but it does in SQL.
An Autonumber field has a few other properties that are different from a normal Long Integer field, but in terms of appending data, those are not relevant. One of those properties is that it is not editable once it's populated, and another is that you can have only one in each table.
I've managed to insert the AutoNumber fields with code from C#.
I take all the data I need and just insert it into an empty table.
How are you bringing the data back? It should be possible to append the data from your table and to keep the existing numbers.
It is necessary, however, that you paste from an integer field into the AutoNumber field. You cannot change a field from integer to AutoNumber once there is data in the field, but you can change from AutoNumber to integer.
Make a backup of your data table. Delete all data from the original table and then compact & repair your database. By doing this, the AutoNumber field will be reset to 1. You may now append your data from the backup table.
SQL code like
insert into <tablename>
(<column 1>, <column2>, ...)
values
( <value 1>, <value 2>, ...);
will do the trick if you include the AutoNumber column in your query. It's pretty tedious, but it works. You can switch to SQL mode for any old query to enter this text (usually after preparing it in a text editor), or as @Dominic P points out, you can bring up a VBA Immediate window and run DoCmd.RunSQL "INSERT INTO ..." which will give you a better editing experience within Access.
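For instance, a concrete version of that statement might look like the following sketch (table, column names, and values are hypothetical; the point is that the AutoNumber column is listed explicitly so an existing value can be supplied):

INSERT INTO Orders
(OrderID, CustomerName)
VALUES
(1042, 'Contoso');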
