I have tried inserting the rows with itertuples, but my file is too big. I even split the file into 4 separate files, and even then each one is too big: one quarter of the file takes more than 30 minutes. Is there an easier and quicker way to import data into SQL Server?
Thanks in advance.
For importing big data quickly, SQL Server has the BULK INSERT command. I tested it on my local server: importing an 870 MB CSV file (10 million records) into SQL Server took 12.6 seconds.
BULK INSERT dbo.import_test_data
FROM 'C:\Users\Ramin\Desktop\target_table.csv'
The full syntax of this command is:
BULK INSERT
{ database_name.schema_name.table_or_view_name | schema_name.table_or_view_name | table_or_view_name }
FROM 'data_file'
[ WITH
(
[ [ , ] BATCHSIZE = batch_size ]
[ [ , ] CHECK_CONSTRAINTS ]
[ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
[ [ , ] DATAFILETYPE =
{ 'char' | 'native'| 'widechar' | 'widenative' } ]
[ [ , ] DATA_SOURCE = 'data_source_name' ]
[ [ , ] ERRORFILE = 'file_name' ]
[ [ , ] ERRORFILE_DATA_SOURCE = 'errorfile_data_source_name' ]
[ [ , ] FIRSTROW = first_row ]
[ [ , ] FIRE_TRIGGERS ]
[ [ , ] FORMATFILE_DATA_SOURCE = 'data_source_name' ]
[ [ , ] KEEPIDENTITY ]
[ [ , ] KEEPNULLS ]
[ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
[ [ , ] LASTROW = last_row ]
[ [ , ] MAXERRORS = max_errors ]
[ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
[ [ , ] ROWS_PER_BATCH = rows_per_batch ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
[ [ , ] TABLOCK ]
-- input file format options
[ [ , ] FORMAT = 'CSV' ]
[ [ , ] FIELDQUOTE = 'quote_characters']
[ [ , ] FORMATFILE = 'format_file_path' ]
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
)]
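In practice only a handful of these options are needed. Extending the example above, a minimal sketch for a comma-separated file with a header row (the FIRSTROW and terminator values are assumptions about the file layout):
BULK INSERT dbo.import_test_data
FROM 'C:\Users\Ramin\Desktop\target_table.csv'
WITH
(
    FORMAT = 'CSV',         -- SQL Server 2017+: proper CSV parsing
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK                 -- table lock; can enable minimally logged, faster loads
);
TABLOCK in particular is worth trying for speed: depending on the recovery model and the target table's indexes, it allows the load to be minimally logged.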
I have a project where I need to determine which municipalities a certain road lies within. To do so, I check this using the ST_Intersects function provided by PostGIS.
The problem I am facing is that I keep getting errors when using the roads in TypeORM's query builder.
My first attempt gave me the following error: QueryFailedError: ST_Intersects: Operation on mixed SRID geometries (MultiPolygon, 4326) != (LineString, 0):
const municipalities = await getConnection()
.getRepository(Municipality)
.createQueryBuilder('municipality')
.where('ST_Intersects(municipality.polygon, :lineString)', { lineString: feature.geometry })
.getMany()
Then I tried to fix this by setting the SRID of the LineString using the ST_SetSRID function, which is also provided by PostGIS. But this gives me the following error: QueryFailedError: function st_setsrid(unknown, integer) is not unique:
const municipalities = await getConnection()
.getRepository(Municipality)
.createQueryBuilder('municipality')
.where('ST_Intersects(municipality.polygon, ST_SetSRID(:lineString, 4326))', { lineString: feature.geometry })
.getMany()
I have also tried converting the GeoJSON object using the ST_GeomFromGeoJSON function in PostGIS. But this gives me the following error: QueryFailedError: quoted object property name expected (at offset 1):
const municipalities = await getConnection()
.getRepository(Municipality)
.createQueryBuilder('municipality')
.where('ST_Intersects(municipality.polygon, ST_GeomFromGeoJSON(:lineString))', { lineString: feature.geometry.coordinates })
.getMany()
I have tried to see whether the problem arises when the LineString is inserted into the query. I did this by using TypeORM's getQueryAndParameters function:
[
'SELECT "municipality"."uuid" AS "municipality_uuid", "municipality"."name" AS "municipality_name", ST_AsGeoJSON("municipality"."polygon")::json AS "municipality_polygon" FROM "municipality" "municipality" WHERE ST_Intersects("municipality"."polygon", ST_SetSRID($1, 4326))',
[ { type: 'LineString', coordinates: [Array] } ]
]
A LineString object looks as follows:
{
type: 'LineString',
coordinates: [
[ 3.2188848, 51.1980164 ],
[ 3.2190236, 51.1981144 ],
[ 3.2190737, 51.1981991 ],
[ 3.2191065, 51.1982793 ],
[ 3.2191314, 51.1983772 ],
[ 3.2191128, 51.1984885 ],
[ 3.2192215, 51.1985207 ],
[ 3.2192447, 51.1985056 ],
[ 3.219269, 51.1985136 ],
[ 3.2193766, 51.1985571 ],
[ 3.2194258, 51.1985769 ],
[ 3.2194638, 51.198697 ],
[ 3.2195437, 51.1987221 ],
[ 3.2196618, 51.1987591 ],
[ 3.219529, 51.1989397 ],
[ 3.2195909, 51.1990766 ],
[ 3.2196759, 51.1992964 ],
[ 3.2197817, 51.1993408 ],
[ 3.2199103, 51.1992987 ],
[ 3.2204127, 51.1991677 ],
[ 3.2208458, 51.199056 ],
[ 3.2211454, 51.1989993 ],
[ 3.2217751, 51.1988675 ],
[ 3.2219908, 51.1988136 ],
[ 3.2223186, 51.1987693 ],
[ 3.2223696, 51.1987867 ],
[ 3.2223974, 51.1987845 ],
[ 3.2225339, 51.1987696 ],
[ 3.2230863, 51.1987062 ],
[ 3.2233638, 51.198683 ],
[ 3.2234618, 51.1986505 ],
[ 3.223517, 51.1986535 ],
[ 3.2235298, 51.1986528 ],
[ 3.2236432, 51.198651 ],
[ 3.2243337, 51.1986565 ],
[ 3.2244463, 51.198654 ],
[ 3.2245644, 51.1986568 ],
[ 3.2249334, 51.1986506 ],
[ 3.2249647, 51.1986493 ],
[ 3.2251107, 51.19867 ],
[ 3.225378, 51.1986779 ],
[ 3.2257131, 51.1987153 ],
[ 3.225936, 51.1987341 ]
]
}
I have found the answer. ST_GeomFromGeoJSON was indeed the right function to use to inject a GeoJSON object into the query, but I had accidentally passed the coordinates instead of the full GeoJSON object.
So the solution is:
const municipalities = await getConnection()
.getRepository(Municipality)
.createQueryBuilder('municipality')
.where('ST_Intersects(municipality.polygon, ST_GeomFromGeoJSON(:lineString))', { lineString: feature.geometry })
.getMany()
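For reference, the SQL that the working call generates boils down to something like the following sketch (shown with just the first two points of the LineString; in recent PostGIS versions ST_GeomFromGeoJSON defaults to SRID 4326, which matches the municipality polygons):
SELECT municipality.name
FROM municipality
WHERE ST_Intersects(
    municipality.polygon,
    ST_GeomFromGeoJSON('{"type":"LineString","coordinates":[[3.2188848,51.1980164],[3.2190236,51.1981144]]}')
);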
I have a .csv file to import into a table. I am using SQL Server 2017,
but my problem is with data like the following:
column1 column2 column3
"12,23,45,67,89", TestData, 04-12-2009
When I use a comma (,) as the field terminator, the value of column1 is split across several columns, because it contains the field terminator (,) itself.
What I am looking for is that values enclosed in double quotes should be exempt from splitting on the field terminator.
According to https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15, the option FIELDQUOTE = 'quote_characters' should be specified (see the example after the syntax excerpt below):
BULK INSERT
{ database_name.schema_name.table_or_view_name | schema_name.table_or_view_name | table_or_view_name }
FROM 'data_file'
[ WITH
(
[ ... general options as listed in the full syntax above ... ]
-- input file format options
[ [ , ] FORMAT = 'CSV' ]
[ [ , ] FIELDQUOTE = 'quote_characters']
[ [ , ] FORMATFILE = 'format_file_path' ]
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
)]
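Applied to the sample data above, a minimal sketch (assuming a target table dbo.my_table already exists, the file has no header row, and the path is a placeholder):
BULK INSERT dbo.my_table
FROM 'C:\data\my_file.csv'
WITH
(
    FORMAT = 'CSV',        -- required for quoted-field handling (SQL Server 2017+)
    FIELDQUOTE = '"',      -- '"' is also the default quote character
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
With FORMAT = 'CSV', the commas inside the quoted "12,23,45,67,89" value no longer split the field, and the surrounding quotes are stripped on import.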
I am trying to reverse an array of coordinates. I have this array:
[
[
9.211615025997162,
44.30419567985762
],
[
9.21164184808731,
44.30412081929745
],
[
9.21165257692337,
44.304053636662175
],
[
9.211606979370117,
44.3038789614507
],
[
9.211572110652924,
44.30381945658962
],
[
9.211539924144745,
44.30378874437967
],
[
9.211507737636566,
44.30376187118276
],
[
9.211564064025879,
44.30377530778277
],
[
9.211574792861937,
44.30374075651943
],
[
9.211574792861937,
44.30371388330059
],
[
9.211574792861937,
44.30369852717
],
[
9.21154797077179,
44.303665895379176
],
[
9.211513102054596,
44.303650539236024
],
[
9.211429953575134,
44.30358911462331
],
[
9.211349487304688,
44.30355840229286
],
[
9.211271703243256,
44.303537287556324
],
[
9.21117514371872,
44.30349505806051
],
[
9.211151003837585,
44.303473943301235
],
[
9.211126863956451,
44.30342979423452
],
[
9.211145639419556,
44.30339908182071
],
[
9.211183190345764,
44.303368369390846
],
[
9.211223423480988,
44.30331846165803
],
[
9.211242198944092,
44.30327623200479
],
[
9.211177825927734,
44.30325895622883
],
[
9.211030304431915,
44.303228243725634
],
[
9.210925698280334,
44.30321096793552
],
[
9.210874736309052,
44.303197531206386
],
[
9.210831820964813,
44.303197531206386
],
[
9.21075403690338,
44.303197531206386
],
[
9.21068161725998,
44.3031994507393
],
[
9.210577011108398,
44.3031994507393
],
[
9.210509955883026,
44.303207128870355
],
[
9.210440218448639,
44.303207128870355
],
[
9.210370481014252,
44.30322632419366
],
[
9.2102712392807,
44.30324743904202
],
[
9.210190773010254,
44.30328199059568
],
[
9.210112988948822,
44.30328391012582
],
[
9.210059344768524,
44.303297346835215
],
[
9.209954738616943,
44.30332422024474
],
[
9.209901094436646,
44.303339576473235
],
[
9.209868907928467,
44.303349174114
],
[
9.209858179092407,
44.30338372560779
],
[
9.209791123867035,
44.30340292087333
],
[
9.209697246551514,
44.30342595518367
],
[
9.209648966789246,
44.3034509090097
],
[
9.209606051445007,
44.30351809233474
],
[
9.209579229354858,
44.30355840229286
],
[
9.209581911563873,
44.3035852755829
],
[
9.209622144699097,
44.30355840229286
],
[
9.20967847108841,
44.30355648277167
],
[
9.209697246551514,
44.30354112659989
]
]
And I would like to reverse the coordinates, which are currently in long-lat order, to lat-long, so reversing each item in the array would do the job here. I have been stuck trying to make this work, trying things like arrayOfCoords.reversed() and .reverse() to no avail. So how can I do this?
There may be a more elegant/efficient way to do this, but I'd look at using compactMap:
let testArray = [[1.123, 2.22], [3.11, 4.0], [5.21, 6.19]]
print(testArray)
// [[1.123, 2.22], [3.11, 4.0], [5.21, 6.19]]
let flippedElementArray = testArray.compactMap { $0.count > 1 ? [$0[1], $0[0]] : nil }
print(flippedElementArray)
// [[2.22, 1.123], [4.0, 3.11], [6.19, 5.21]]
The $0.count > 1 check just makes sure you have at least 2 elements so you don't get a bounds error. If not, the closure returns nil for that element, which will be filtered out by compactMap. Otherwise, it returns a new array element consisting of the second element followed by the first element. All of those elements are stored in the final mapped array (flippedElementArray).
arrayOfCoords.map{ Array($0.reversed()) }
I'm assuming you want to reverse each coordinate pair, but keep the pairs themselves in the original order. Since you have an array of arrays, you have to do the operation on each sub-array separately. And reversed() returns a ReversedCollection instead of an Array so you have to re-initialize it into an array to use it properly.
How do I update data in one table from another table where a common column value matches (the tables are on different servers)? Can we design an SSIS package for such a case?
You can link the server using
exec sp_addlinkedserver [ @server= ] 'server' [ , [ @srvproduct= ] 'product_name' ]
[ , [ @provider= ] 'provider_name' ]
[ , [ @datasrc= ] 'data_source' ]
[ , [ @location= ] 'location' ]
[ , [ @provstr= ] 'provider_string' ]
[ , [ @catalog= ] 'catalog' ]
http://msdn.microsoft.com/en-us/library/ms190479.aspx
and then
select * from [server].[database].[schema].[table]
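Once the server is linked, the update itself is an ordinary UPDATE with a join on the common column. A sketch, where REMOTESRV, RemoteDb, and the table and column names are all placeholders:
-- Link the remote SQL Server by its network name
EXEC sp_addlinkedserver @server = N'REMOTESRV', @srvproduct = N'SQL Server';

-- Update local rows from the matching remote rows
UPDATE t
SET t.some_column = s.some_column
FROM dbo.local_table AS t
JOIN [REMOTESRV].[RemoteDb].[dbo].[remote_table] AS s
    ON s.common_column = t.common_column;
An SSIS package would work too, but for a single-table sync a linked server plus a plain UPDATE is usually simpler.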
I have a set of SQL Server jobs and I want their schedules to be dynamic, i.e. I want the next run date to come from a table.
I have tried updating next_run_date in the sysjobschedules table and next_scheduled_run_date in sysjobactivity table but this doesn't do anything.
How would I go about this?
I think you can use sp_update_schedule to update the schedule:
sp_update_schedule
{ [ @schedule_id = ] schedule_id
| [ @name = ] 'schedule_name' }
[ , [ @new_name = ] new_name ]
[ , [ @enabled = ] enabled ]
[ , [ @freq_type = ] freq_type ]
[ , [ @freq_interval = ] freq_interval ]
[ , [ @freq_subday_type = ] freq_subday_type ]
[ , [ @freq_subday_interval = ] freq_subday_interval ]
[ , [ @freq_relative_interval = ] freq_relative_interval ]
[ , [ @freq_recurrence_factor = ] freq_recurrence_factor ]
[ , [ @active_start_date = ] active_start_date ]
[ , [ @active_end_date = ] active_end_date ]
[ , [ @active_start_time = ] active_start_time ]
[ , [ @active_end_time = ] active_end_time ]
[ , [ @owner_login_name = ] 'owner_login_name' ]
[ , [ @automatic_post = ] automatic_post ]
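One way to make the next run date come from your own table is a small wrapper that reads the date and pushes it into the schedule. A sketch, where dbo.job_schedule_config, its columns, and the schedule name are made-up names:
DECLARE @next_date INT;

-- SQL Agent stores dates as integers in yyyymmdd form (style 112)
SELECT @next_date = CONVERT(INT, CONVERT(CHAR(8), next_run_date, 112))
FROM dbo.job_schedule_config
WHERE job_name = N'MyNightlyJob';

EXEC msdb.dbo.sp_update_schedule
    @name = N'MyNightlySchedule',
    @freq_type = 1,                  -- 1 = run once, on the start date
    @active_start_date = @next_date;
Run from a job step or a trigger whenever the table changes, this goes through the supported Agent API, which, unlike direct edits to sysjobschedules, the Agent will actually honor.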