client and server + sorting - c

In my program, I'm getting the username as a startup parameter and getting the user-id from the server.
Server and client are connected!
But I need to put the username and the user-id (that belongs to the username) together, send them back to the server and, at the end, sort them by user-id. Any suggestions?

Just send them both to the server, taking care to include some kind of delimiter so it's possible to parse the stream and identify which part is which.
For instance, you could separate the two with a NUL byte, with a final NUL after the user-id, too. This assumes that both are strings.
On the server, just store the incoming data into an array of structs, and sort the array with qsort() when you have all the data collected.
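A minimal sketch of that server side under those assumptions (the struct and function names are just illustrative, and the two hard-coded buffers stand in for whatever recv() delivers):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_USERS 64

struct user_rec {                 /* one username/user-id pair */
    char name[32];
    int  id;
};

/* comparison callback for qsort(): order by user-id ascending */
static int cmp_by_id(const void *a, const void *b)
{
    const struct user_rec *ua = a, *ub = b;
    return (ua->id > ub->id) - (ua->id < ub->id);
}

/* parse "name\0id\0" out of one received buffer */
static void parse_record(const char *buf, struct user_rec *out)
{
    strncpy(out->name, buf, sizeof out->name - 1);
    out->name[sizeof out->name - 1] = '\0';
    out->id = atoi(buf + strlen(buf) + 1);    /* skip the name and its NUL */
}

int main(void)
{
    /* pretend these buffers arrived from the clients via recv() */
    const char msg1[] = "alice\0" "42\0";
    const char msg2[] = "bob\0" "7\0";

    struct user_rec users[MAX_USERS];
    size_t n = 0;

    parse_record(msg1, &users[n++]);
    parse_record(msg2, &users[n++]);

    qsort(users, n, sizeof users[0], cmp_by_id);   /* sort by user-id */

    for (size_t i = 0; i < n; i++)
        printf("%d %s\n", users[i].id, users[i].name);
    return 0;
}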

Related

Best way to handle large amount of inserts to Azure SQL database using TypeORM

I have an API created with Azure Functions (TypeScript). These functions receive arrays of JSON data, convert them to TypeORM entities and insert them into an Azure SQL database. I recently ran into an issue where the array had hundreds of items, and I got an error:
The incoming request has too many parameters. The server supports a maximum of 2100 parameters. Reduce the number of parameters and resend the request
I figured that saving all of the data at once using the entity manager causes the issue:
const connection = await createConnection();
connection.manager.save(ARRAY OF ENTITIES);
What would be the best scalable solution to handle this? I've got a couple of ideas, but I've no idea if they're any good, especially performance-wise.
Begin transaction -> Start saving the entities individually inside forEach loop -> Commit
Split the array into smaller arrays -> Begin transaction -> Save the smaller arrays individually -> Commit
Something else?
Right now the array sizes are in the tens or hundreds, but occasional arrays with 10k+ items are also a possibility.
One way you can massively scale is to let the DB deal with that problem, e.g. by using External Tables. The DB does the parsing; your code only orchestrates.
For example:
Make the data to be inserted available in ADLS (Data Lake):
Instead of calling your REST API with all the data (in the body or query params as an array), the caller would write the data to an ADLS location as a csv/json/parquet/... file. OR
The caller remains unchanged; your Azure Function writes the data to some csv/json/parquet/... file in an ADLS location (instead of writing to the DB).
Make the DB read and load the data from ADLS:
First: CREATE EXTERNAL TABLE tmpExtTable ... WITH (LOCATION = '<ADLS-location>', ...)
Then: INSERT INTO actualTable SELECT * FROM tmpExtTable
See the formats supported by CREATE EXTERNAL FILE FORMAT.
You need not delete and re-create the external table each time. Whenever you run a SELECT on it, the DB will go and parse the data in ADLS again. But that's a choice.
I ended up doing this the easy way, as TypeORM already provides the ability to save in chunks. It might not be the most optimal way, but at least I got rid of the "too many parameters" error.
// Save all data in chunks of 100 entities
connection.manager.save(ARRAY OF ENTITIES, { chunk: 100 });

Import data from multiple MySQL data sources via SSIS?

I have a particularly challenging situation that I could use some assistance with.
I work for a manufacturing facility and am working on a proof of concept.
I have a number of client devices (PIs) fixed to manufacturing equipment, all collecting data from the equipment and storing it locally within an embedded MySQL database on the device. I would like to import the data from each of the devices into a central Microsoft SQL Data Warehouse. I would prefer this to be pulled from the devices by the server, rather than being pushed from the client devices.
I would then like the embedded database on the device to be updated / purged, to prevent the same data from being resent (initially I was thinking a date field in a table which I just timestamp once that record has been copied).
My feeling is that an SSIS package would be the way to go here. I have IP addresses and connection information for the PIs in a table within the DW, and so would like to connect to each client in turn to import the data and update it.
Is there a way to change a connection string on the fly within SSIS? OR would there be a better way to achieve this - maybe via a sproc on the DW?
I'm ok with sprocs, but very new to SSIS. If you have any links/tutorials/posts that may help, please share. Thanks.
EDIT: This is what I already have
Here are my variables:
As you can see it is showing an error when attempting to run on the first step.
Also, FWIW, here's the progress output...
Is there a way to change a connection string on the fly within SSIS?
Use a variable to store the connection string, and use that variable to populate the Expression value of the connection string. Then when you change the value of the variable, you will change the value of the connection string.
It's not the full answer, but it's something like this:
A) Create a table with all the IP addresses and connection strings.
B) In SSIS, create variables for each property, i.e. a variable IPAddress.
C) Create an Execute SQL Task; set the result set to Full result set.
Also add a Result Set entry: Result Name 0, Variable Name Rows.
D) Create another variable Rows with data type System.Object.
E) Add a Foreach Loop Container: ADO enumerator over Rows.
Variable Mappings: IPAddress.
F) Create the source connection manager.
Set its connection string via an Expression built from your variables.
G) Add a Data Flow Task and fetch the data from each connection.

Ensure SQL Server sensitive data authenticity

I have an odd question that needs a creative answer.
I have coded a program that writes sensitive data to a SQL Server table with 3 columns.
Every time this program starts I need to check this data.
The problem is that I need a way to check that the data in these 3 fields was written by my process and not manipulated or copied externally or by another process.
So if the data in any of the 3 fields was modified externally, my code should not recognize the data. Also, if the data was copied from another server, it should not be recognized either.
What I have in mind:
1) Create a secret private key from data unique to the SQL instance.
2) Create a binary field on the table.
3) When the data is written, fill the binary field using the PwdEncrypt function with the private key as data.
4) When the data is read, use PwdCompare to check whether the data in the binary field matches the private key.
Now, how can I ensure that other fields are not modified?
I need this to work on several servers ranging from SQL Server Express 2008 R2 to SQL Server Standard 2016.
Thanks!
Your approach is pretty much correct, but you don't need anything asymmetric here; a simple HMAC will do.
When modifying any row of a table that requires data authenticity, concatenate the binary values of all fields that you want authenticated and run the final binary string through an HMAC with a secret key stored only with your process.
Do the same again when checking to see if the row is valid and compare the two resulting hashes using a time-safe check. If they don't match, something has been tampered with.
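A rough sketch of that check in C, using OpenSSL's HMAC-SHA256 and CRYPTO_memcmp for the constant-time comparison (the field values and the in-source key are placeholders only; the real key would come from secure storage held by your process):

#include <stdio.h>
#include <string.h>
#include <openssl/hmac.h>
#include <openssl/crypto.h>    /* CRYPTO_memcmp */

/* compute HMAC-SHA256 over the concatenation of the three fields;
   the +1 lengths include each field's terminating NUL as a separator,
   so "ab" + "c" and "a" + "bc" cannot produce the same MAC */
static void row_mac(const char *key, size_t keylen,
                    const char *f1, const char *f2, const char *f3,
                    unsigned char out[32])
{
    HMAC_CTX *ctx = HMAC_CTX_new();
    HMAC_Init_ex(ctx, key, (int)keylen, EVP_sha256(), NULL);
    HMAC_Update(ctx, (const unsigned char *)f1, strlen(f1) + 1);
    HMAC_Update(ctx, (const unsigned char *)f2, strlen(f2) + 1);
    HMAC_Update(ctx, (const unsigned char *)f3, strlen(f3) + 1);
    unsigned int len = 32;
    HMAC_Final(ctx, out, &len);
    HMAC_CTX_free(ctx);
}

int main(void)
{
    const char key[] = "per-process secret";    /* placeholder key */
    unsigned char stored[32], computed[32];

    /* when writing the row: compute the MAC and store it in the binary column */
    row_mac(key, sizeof key - 1, "field1", "field2", "field3", stored);

    /* when reading the row back: recompute and compare in constant time */
    row_mac(key, sizeof key - 1, "field1", "field2", "field3", computed);

    if (CRYPTO_memcmp(stored, computed, sizeof stored) == 0)
        puts("row is authentic");
    else
        puts("row was tampered with");
    return 0;
}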

Extracting user details from a DB and using these to login via Jmeter

I have a DB full of users (3 million) that I want to use during testing; I do not want to create or use the CSV file method for this. I would like to pick a new user to log in with each time.
In my test plan I have a Once Only Controller where I have put a JDBC connection to my DB and two JDBC processors:
one that counts the number of users in the DB ($users)
one that selects all the emails in the DB that fit a certain criterion; I return this as an array of emails ($emails)
What I am struggling with is using these to construct a loop that selects a different email each time so I can pass these as inputs to my login requests. Each thread should use a different email address.
Are you setting the variable names textbox in your JDBC sampler?
You can use the _index notation to access the results returned by the JDBC sampler. For example, if your query returns a single column of data, and you put emails in the variable names textbox, then:
emails_# will contain the number of result rows
emails_1 will contain the first row value
emails_2 will contain the second row value, and so on.
ref: JDBC Sampler
You could use the following SQL query to get random email:
SELECT email FROM table WHERE your_criteria = 'something' ORDER BY RAND() LIMIT 1;
directly in the JDBC PreProcessor where random email is required.
Pre/Post Processor and Timer processing time is not included in the response time by default, so it is quite a "safe" operation (unless you use a Transaction Controller).

Tool or method to parse an incoming e-mail, strip the data and push it into a database

Has anybody done anything with scripts to parse an incoming e-mail to a specific address, strip out the data and insert it into a SQL database?
The e-mail would be coming in through an Exchange 2003 server and would be in a known fixed format, i.e.
Name: Firstname Surname
ID Number: nnnnnnn
etc.
etc.
Ideally the solution would need to operate on the server and not a client. Any advice appreciated.
Originally posted on Server Fault but cross-posted here for the scripting angle.
Yes I have. What are you trying to accomplish with this?
If you want to know how to parse it, the easiest way I have found so far is to build a simple scraper/pattern matcher for your fixed format that doesn't change.
I generally iterate through each line of the email looking for a specific element/identifier, after which I read so many characters into a variable for the row to be committed to the database.
So:
Download all emails
loop through each email
loop through each line in each email
find each element, one at a time, in incremental order
Delete the emails from the server.
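As a rough sketch of the line-matching step in C (the field labels come from the fixed format in the question; the hard-coded message body and everything else here is only illustrative, and the download/insert steps are left out):

#include <stdio.h>
#include <string.h>

struct record {
    char name[64];
    char id[32];
};

/* if the line starts with the given label, copy the remainder into dest */
static int match_field(const char *line, const char *label,
                       char *dest, size_t destlen)
{
    size_t n = strlen(label);
    if (strncmp(line, label, n) != 0)
        return 0;
    line += n;
    while (*line == ' ')
        line++;                            /* skip spaces after the colon */
    snprintf(dest, destlen, "%s", line);
    dest[strcspn(dest, "\r\n")] = '\0';    /* strip any line ending */
    return 1;
}

int main(void)
{
    /* stand-in for one downloaded message body in the known fixed format */
    const char *body =
        "Name: Firstname Surname\n"
        "ID Number: 1234567\n";

    struct record rec = {0};
    char line[256];
    const char *p = body;

    /* loop through each line, looking for the known identifiers */
    while (*p) {
        size_t len = strcspn(p, "\n");
        snprintf(line, sizeof line, "%.*s", (int)len, p);
        if (!match_field(line, "Name:", rec.name, sizeof rec.name))
            match_field(line, "ID Number:", rec.id, sizeof rec.id);
        p += len + (p[len] == '\n');
    }

    /* rec.name / rec.id would now be bound as parameters of an INSERT */
    printf("name=%s id=%s\n", rec.name, rec.id);
    return 0;
}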
