How can I make bulkCopy map the same columns? - sql-server

Suppose I have a SQL Table that has these columns:
[server_name],[SESSION_ID],[SESSION_SPID]
I am trying to copy values stored in a data table ($dmvResult) to the SQL Table above ($Table)
$dmvResult = DMV_Query 'SELECT [SESSION_ID]
,[SESSION_SPID]
FROM $SYSTEM.DISCOVER_SESSIONS';
$ConnectionString = "Data Source=$server; Database=$database; Trusted_Connection=True;"
$bulkCopy = new-object Data.SqlClient.SqlBulkCopy($ConnectionString)
$bulkCopy.DestinationTableName=$Table
$bulkCopy.WriteToServer($dmvResult)
While the copy itself succeeds, there is an issue: the rows are copied by position, not by column-name match. In other words, the copied columns are not being mapped to the destination columns with the same names.
[SESSION_ID] is being copied to [server_name] and
[SESSION_SPID] is being copied to [SESSION_ID]
How can I tell bulkCopy to match columns and copy?
The expected result is that [server_name] ends up empty, because it wasn't selected in the DMV query.
I found a neat solution in this thread:
https://stackoverflow.com/a/20045505/8397835
but I don't know how to translate it to my PowerShell code:
var meta = definition.Context.Mapping.GetMetaType(typeof(T));
foreach (var col in meta.DataMembers)
{
copy.ColumnMappings.Add(col.Member.Name, col.MappedName);
}
EDIT: foreach column.ColumnName output
EDIT2:
I tried this:
$dmvResult.Columns | %{ $_.Name }
and it doesn't output anything.
Before you say the $dmvResult data table must then be empty: explain how it is possible that this actually works and copies data in?
$bulkCopy.ColumnMappings.Add('SESSION_ID', 'SESSION_ID')
$bulkCopy.ColumnMappings.Add('SESSION_SPID', 'SESSION_SPID')
$bulkCopy.WriteToServer($dmvResult)
and for some reason it's outputting this to the console as well:
so the data table $dmvResult is clearly populated.
I was hoping that, instead of defining a mapping for every single column like this:
$bulkCopy.ColumnMappings.Add('SESSION_SPID', 'SESSION_SPID')
there would be an automatic option like this:
foreach ($column in $dmvResult.Columns) { $bulkCopy.ColumnMappings.Add($column.ColumnName, $column.ColumnName)}
but that throws an exception:
Exception calling "WriteToServer" with "1" argument(s): "The given ColumnMapping does not match up with any column in the source or destination."

A very weird solution, but I just had to add a comma here before $dataSet:
,$dataSet.Tables[0]
in the DMV_Query function, and then I used this foreach loop:
foreach ($column in $dmvResult.Columns) { $bulkCopy.ColumnMappings.Add($column.ColumnName, $column.ColumnName) > $null }
and it worked!
It now maps the columns automatically!
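For reference, here is why the comma matters. PowerShell unrolls collections written to the pipeline, so a function that ends with $dataSet.Tables[0] returns a stream of DataRow objects rather than the DataTable itself; that is why $dmvResult.Columns printed nothing, even though WriteToServer (which also accepts an array of DataRows) still copied the data positionally. The original DMV_Query function isn't shown in the question, so the following is only a minimal sketch of what it might look like, assuming it fills a DataSet through an OLE DB adapter against the Analysis Services instance; the connection string and parameter names are hypothetical, and the part that matters is the leading comma on the return line:
function DMV_Query {
    param(
        [string]$Query,
        # Hypothetical connection string to the Analysis Services instance.
        [string]$SsasConnectionString = 'Provider=MSOLAP;Data Source=localhost;'
    )

    $connection = New-Object System.Data.OleDb.OleDbConnection($SsasConnectionString)
    $adapter    = New-Object System.Data.OleDb.OleDbDataAdapter($Query, $connection)
    $dataSet    = New-Object System.Data.DataSet
    [void]$adapter.Fill($dataSet)

    # The unary comma wraps the DataTable in a one-element array. PowerShell then
    # unrolls the array instead of the table, so the caller receives the DataTable
    # intact, with its Columns collection available for building ColumnMappings.
    return ,$dataSet.Tables[0]
}
With the table arriving intact, the foreach loop above can add one mapping per source column, and unmapped destination columns such as [server_name] are simply left empty.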

Related

Use Perl to select multiple query output data from SQL database and place into single Excel sheet

I am new to Perl, so I'm hoping someone can help me with this.
I'm working on a validation project where we extract data from a database and show it as a report in Excel. I have a Perl script to extract data from the database and place it into Excel, but I tried to extract data from multiple queries and display the combined output in a single Excel sheet.
How do I return multiple query outputs in a single Excel sheet?
use strict;
use DBI;
use Excel::Writer::XLSX;
# connect to the db
my $dbh = DBI->connect('dbi:ODBC:dbname', 'dblogin', 'dbpassword');
my $sth1 = $dbh->selectall_arrayref(" select col1, col2 from table");
foreach my $ln (@$sth1) {
    my ($col1, $col2) = @$ln;
    print "$col1 $col2\n";
}
my $workbook = Excel::Writer::XLSX->new( 'col1col2test.xlsx' );
my $worksheet = $workbook->add_worksheet();
$worksheet->write( "A1", "$col1" );
$worksheet->write( "A2", "$col2" );
If that is your code, then it will be throwing errors complaining about $col1 and $col2 needing an "explicit package name".
This means that you haven't declared the variables correctly. You declare variables using the keyword my. But my declares the variables within a block of code. So when you write this:
foreach my $ln (@$sth1) {
    my ($col1, $col2) = @$ln;
    print "$col1 $col2\n";
}
the variables are only declared within that block. And when you try to use them outside the block:
$worksheet->write( "A1", "$col1" );
$worksheet->write( "A2", "$col2" );
You will get the error I mentioned above.
The fix for this is to declare the variables at the correct level - that is outside of the loop.
my ($col1, $col2);
foreach my $ln (@$sth1) {
    ($col1, $col2) = @$ln;
    print "$col1 $col2\n";
}
For future reference, when asking for help on a site like this it's really useful if you give us all the useful information - particularly any error messages that you see.

using WPDB to display external database-data inside a WP-shortcode

I'm trying to figure out a way to use WPDB to load a whole row, or single cells/fields, from another table (not the WordPress DB) and display them in a shortcode. I have a bunch of weather-data values; for a start I need the latest row of the database (each column is a different data type: temp, wind, humidity, etc.).
Sadly, the plugin that would do everything that I need, SQL Shortcode, doesn't work anymore. I found this now:
https://de.wordpress.org/plugins/shortcode-variables/
Though I still need to use some PHP/PDO-foo to get the data from the database.
By heavy copy&pasting I came up with this:
<?php
$hostname = 'localhost';
$username = 'root';
$password = '';
$dbname   = 'sensordata';
$db = new PDO("mysql:host=$hostname;dbname=$dbname", $username, $password);
$result = $db->prepare("SELECT * FROM `daten` WHERE id = (SELECT MAX(id) FROM `daten`)");
$result->execute();
while ($row = $result->fetch(PDO::FETCH_ASSOC))
{
    $data = $row['*'];
}
echo $data;
?>
But obviously it's not working. What do I need to get this done with WPDB?
Kind regards :)
Just in case anyone else needs this in the future, I used this now:
<?php
//connect to the database
$dbh = new PDO('mysql:host=localhost;dbname=databasename', 'dbuser', 'dbpasswort');
//query the database "databasename", selecting "columnname" from table "tablename", checking that said column has no NULL entry, sort it by column "id" (autoincrementing numeric ID), newest first and just fetch the last one
$sth = $dbh->query("SELECT `columnname` FROM `tablename` WHERE `columnname` IS NOT NULL order by id desc limit 1")->fetchColumn(0);
//print the value/number
print_r($sth);
?>
By using "SELECT colum1, colum2,... FROM" You should get all the columns, could be that fetchColumn needs to be replaced with something different though.

Working with the data from LinQ SQL query

Using VS 2013 (VB) and SQL server 2016.
I have a LINQ query that returns two columns from a database. The query is as follows:
Dim val = (From value In db.ngc_flowTypes
Where value.defaultValue IsNot Nothing
Select value.flowName, value.defaultValue)
The data it returns is as follows.
I want to iterate through each row of the results and pass the values to certain variables. A For Each statement doesn't seem to work, as it just runs through once. I am sure this must be easy, but I don't quite understand it. Am I getting the data returned in the best way via my query? Can I transpose the data to a data table in VB so I can work with it more easily?
The end result I want is a string for each flow name with its corresponding default value (along with some other text). So something like this:
Dim strsubmission As String = flowName + " has a value of " + defaultValue
Use ToDictionary.
Dim val = (From value In db.ngc_flowTypes
Where value.defaultValue IsNot Nothing
Select value).ToDictionary(Function(key) key.flowName,
Function(value) value.defaultValue)
This will actually execute the SQL of the LINQ query on the database (approximately Select * From ngc_flowTypes Where defaultValue Is Not NULL), traverse each record into a key/value pair (flowName, defaultValue) and put it into an in-memory dictionary variable (val).
After that you can do whatever you like with the dictionary.
For Each flowName In val.Keys
Console.WriteLine("{0} has a value of {1}", flowName, val(flowName))
Next
Edit:
This will only work as long as flowName is unique in the table ngc_flowTypes.

SQL Server 2016 SSIS get cursor from stored procedure

I am using SQL Server 2016.
I have a stored procedure GET_RECORDS that takes input parameters for filtering and outputs a CURSOR parameter.
I want to get this cursor in my SSIS package.
I created a data flow task, an OLE DB source, and variables for the parameter values, then mapped the parameters:
Params mapping screen
but when I tried to save the component, I got an error:
error screen
I tried to add a WITH RESULT SETS clause with some dummy columns, but my procedure doesn't return any result set.
What am I doing wrong?
Any advice would be helpful.
Thank you.
With regards, Yuriy.
The source component is trying to determine what columns and types will be returned. Because you are using dynamic SQL, the metadata can change each time you run it.
WITH RESULT SETS allows you to define the data being returned, but it should only be used if you are guaranteed to get those results every time you execute.
EDIT:
I create a connection and run the command so that it populates a data table. Then I put the column headers into a string array. There are plenty of examples out there.
Then I use the following function to create a destination table. Finally, I create a data reader and pass that to the .NET SqlBulkCopy (a rough sketch of that last step follows the function below). Hope this helps.
private void CreateTable(string TableName, string[] Fields)
{
    if (TableExists(TableName) && Overwrite)
    {
        SqlCommand = new SqlCommand($"Drop Table [{TableName}]", SqlConnection);
        SqlCommand.ExecuteNonQuery();
    }
    string Sql = $"Create Table [{TableName}] (";
    int ColumnNumber = 1;
    foreach (string Field in Fields)
    {
        string FieldValue = Field;
        if (!HasHeaders)
        {
            FieldValue = "Column" + ColumnNumber;
            ColumnNumber++;
        }
        Sql += $"[{FieldValue}] Varchar(8000),";
    }
    Sql = Sql + "ImportFileID Int, ID Int Identity(1,1) Not Null, Constraint [PK_" + TableName + "] Primary Key Clustered ([ID] Asc))";
    SqlCommand = new SqlCommand(Sql, SqlConnection);
    SqlCommand.ExecuteNonQuery();
}
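That data-reader step is only described above, not shown. As a rough sketch, written in PowerShell for consistency with the main thread at the top of the page (the parameter name, connection strings and destination table name are placeholders, and it assumes the procedure is changed to SELECT its records rather than returning a cursor), passing the reader straight to SqlBulkCopy could look roughly like this:
# All connection strings, the parameter and the destination table name below are placeholders.
$sourceConnection = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=SourceDb;Integrated Security=True;")
$sourceConnection.Open()

$command = $sourceConnection.CreateCommand()
$command.CommandType = [System.Data.CommandType]::StoredProcedure
$command.CommandText = "dbo.GET_RECORDS"
[void]$command.Parameters.AddWithValue("@Filter", "some value")

# ExecuteReader streams the procedure's result set instead of buffering it in memory.
$reader = $command.ExecuteReader()

$destConnection = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=TargetDb;Integrated Security=True;")
$destConnection.Open()

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($destConnection)
$bulkCopy.DestinationTableName = "dbo.ImportedRecords"    # e.g. the table created by CreateTable
$bulkCopy.WriteToServer($reader)                          # WriteToServer accepts any IDataReader

$reader.Close()
$sourceConnection.Close()
$destConnection.Close()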
Use an ADO.NET source instead of an OLE DB source, define a simple select and get the columns you wish to return. Now you can define an expression in the data flow properties.
Search for "ado.net source dynamic sql".
:)
Try to return the records and use a Foreach loop in the ETL instead of a cursor:
https://www.simple-talk.com/sql/ssis/implementing-foreach-looping-logic-in-ssis/
I think you can do it in a simpler way, but I don't know what exactly you are doing...

SQLBulkCopy: Does Column Count make difference?

I tried to search but didn't find an answer to a relatively simple thing. I have a CSV that doesn't have all the columns that my database table has; the CSV is also missing the auto-increment primary key.
All I did is read the CSV into a DataSet and then run traditional SqlBulkCopy code to copy the first table of the dataset to the database table. But it gives me the following error:
The given ColumnMapping does not match up with any column in the source or destination.
My code for the bulk copy is:
using (SqlBulkCopy blkcopy = new SqlBulkCopy(DBUtility.ConnectionString))
{
    blkcopy.EnableStreaming = true;
    blkcopy.DestinationTableName = "Project_" + this.ProjectID.ToString() + "_Data";
    blkcopy.BatchSize = 100;
    foreach (DataColumn c in ds.Tables[0].Columns)
    {
        blkcopy.ColumnMappings.Add(c.ColumnName, c.ColumnName);
    }
    blkcopy.WriteToServer(ds.Tables[0]);
    blkcopy.Close();
}
I added the mapping to test, but removing the mapping part doesn't make a difference. If we remove the mapping it tries to match the columns in order, and since the column counts differ they end up with mismatched data types, missing column values, etc. Oh, and yes, the column names from the CSV do match those of the table, and are in the same case.
EDIT: I changed the mapping code to compare the column names against the live DB. For this I simply run a SQL SELECT query to fetch one record from the database table and then do the following:
foreach (DataColumn c in ds.Tables[0].Columns)
{
    if (LiveDT.Columns.Contains(c.ColumnName))
    {
        blkcopy.ColumnMappings.Add(c.ColumnName, c.ColumnName);
    }
    else
    {
        log.WriteLine(c.ColumnName + " doesn't exists in final table");
    }
}
I would dump the results of the CSV into a staging SQL table and then do a simple insert from the staging table to the production table.
Also try a simple import of the CSV into a SQL table; maybe there are some empty/invalid columns within the CSV file.
I once had this problem and the cause was a difference in the case of the column names. One of the columns was "Id", but in the DB it was "id".
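As that answer suggests, the name matching for SqlBulkCopy column mappings is case-sensitive, and a case mismatch produces the same "The given ColumnMapping does not match up..." error seen in the threads above. One way around it is sketched below, written in PowerShell to match the main thread and assuming the $ConnectionString, $Table and $dmvResult variables from that question: read the destination schema first and reuse its exact column spelling when adding mappings.
# Sketch only: maps each source column to the destination column with the same
# name, comparing case-insensitively but keeping the destination's exact spelling.
$connection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$connection.Open()

# Fetch the destination column names without returning any rows.
$command = $connection.CreateCommand()
$command.CommandText = "SELECT TOP 0 * FROM $Table"
$reader = $command.ExecuteReader()
$destColumns = $reader.GetSchemaTable().Rows | ForEach-Object { $_.ColumnName }
$reader.Close()

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connection)
$bulkCopy.DestinationTableName = $Table

foreach ($column in $dmvResult.Columns) {
    # Look for a destination column with the same name, ignoring case.
    $match = $destColumns | Where-Object { $_ -ieq $column.ColumnName } | Select-Object -First 1
    if ($match) {
        [void]$bulkCopy.ColumnMappings.Add($column.ColumnName, $match)
    }
}

$bulkCopy.WriteToServer($dmvResult)
$connection.Close()
Source columns that have no counterpart in the destination are simply skipped instead of triggering the mapping error.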
