How can I create multiple server streams for a single user in gatling grpc load scripts - gatling

What do I want to achieve?
For each user, I want to create multiple server streams and for each stream I want to ensure the response received has a certain state.
What behaviour am I seeing?
The user is able to successfully create the first stream but runs into the following error when creating subsequent streams.
12:07:56.809 [gatling-1-1] ERROR c.g.p.g.g.a.ServerStreamStartAction - 'serverStreamStart-2' failed to execute: Unable to create a new server stream with name stream1: already exists
What have I tried?
See below for an excerpt of what my code looks like:
class StreamingSimulation extends Simulation {
  private final val scn = scenario("simulation-lt-scenario")
    .feed(
      feederFor("payload", requestHelper.getStreamingRequest())
    )
    .exec(
      grpc("action request").serverStream("stream1")
        .start(GrpcService.getMyStreamingMethod)($("payload"))
        .check(extract { r: MyResponse => Option(r.someState) } is STATE_A)
    )

  setUp(
    scn.inject(rampUsers(1))
      .throttle(
        reachRps(throttleRPS) in (60 seconds),
        holdFor(300 seconds)
      )
      .maxDuration(300 seconds)
      .protocols(grpc(configureChannel(config)))
  )
}
It looks like a user is not supposed to create more than one stream with the same name using this setup. Is there an alternative approach that will let me reach my desired behaviour?
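The error suggests that stream names must be unique within a virtual user's session. One possible workaround, shown here as a minimal sketch reusing the helpers from the excerpt above, is to give each concurrent stream its own distinct name:

// minimal sketch: distinct stream names per virtual user, one .exec per stream
.exec(
  grpc("action request 1").serverStream("stream1")
    .start(GrpcService.getMyStreamingMethod)($("payload"))
    .check(extract { r: MyResponse => Option(r.someState) } is STATE_A)
)
.exec(
  grpc("action request 2").serverStream("stream2")
    .start(GrpcService.getMyStreamingMethod)($("payload"))
    .check(extract { r: MyResponse => Option(r.someState) } is STATE_A)
)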

Related

DolphinDB Exception: System handle is not able to serialize

I have deployed a DolphinDB cluster successfully like this:
File cluster.nodes:
localSite,mode
192.168.1.112:1210:agent1,agent
192.168.1.112:1221:DFS_NODE1121,datanode
192.168.1.112:1222:DFS_NODE1122,datanode
192.168.1.112:1223:DFS_NODE1123,datanode
192.168.1.112:1224:DFS_NODE1124,datanode
192.168.1.112:1225:DFS_NODE1125,datanode
192.168.1.120:1210:agent2,agent
192.168.1.120:1221:DFS_NODE1201,datanode
192.168.1.120:1222:DFS_NODE1202,datanode
192.168.1.120:1223:DFS_NODE1203,datanode
192.168.1.120:1224:DFS_NODE1204,datanode
192.168.1.120:1225:DFS_NODE1205,datanode
Then I connected to server 192.168.1.112:1221 and tried to call function rpc in this way:
dbDir = 'dfs://valueDB'
schema = table(100000000:0, ["devId", "temperature"], ["INT", "DOUBLE"] )
if(existsDatabase(dbDir))
    dropDatabase(dbDir)
db = database(dbDir, VALUE, 0..99)
dev = db.createPartitionedTable(schema,`dev,`devId)
dev.append!(table(1..10 as devId, 20.5+rand(10,10) as temperature))
rpc("DFS_NODE1122", loadTable, db, `dev)
But the system threw the exception message like this:
Execution was completed with exception
System handle is not able to serialize.
So, how do I call rpc correctly?
Like a file handle or a socket handle, a database handle is only valid on the node where it was opened. For this reason, DolphinDB disallows serializing a database handle to a remote data node.
loadTable also accepts a database directory. Please modify your code as follows:
rpc("DFS_NODE1122", loadTable, dbDir, `dev)
In fact, the rpc call above isn't even necessary: since the table `dev is a distributed table, you can load it on any data node directly.
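For instance, a minimal sketch of loading the table directly on whichever node you are connected to, with no rpc at all:

// load the DFS table on the current node; works the same on any data node
t = loadTable(dbDir, `dev)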

Storing chat conversations in sql database using signalR

I'm developing a class library that contains generic methods for these scenarios:
Live support chat (1 on 1 private text chat, with many admins and guests)
Rooms with many users where you can send broadcast and private messages
These two features are already implemented, and now my application needs to save the messages.
My question is: what is the best way to store chat conversations in a SQL database?
Every time I click send, insert the message into the database?
Keep a list of messages per user, append each sent message to the sender's list, and when a user disconnects, iterate over that list and insert all of its messages into the database?
Are there other solutions?
What I'm doing now is the following. I have this method, which is located in my Hub class:
public void saveMessagetoDB(string userName, string message)
{
    // note: userName is passed in but not persisted here; add it to the
    // entity if tbl_Conversation has a column for the sender
    using (var ctx = new TestEntities1())
    {
        var msg = new tbl_Conversation { Msg = message };
        ctx.tbl_Conversation.Add(msg);
        ctx.SaveChanges();
    }
}
I call this saveMessagetoDB method from my client-side HTML file like this:
$('#btnSendMessage').click(function () {
    var msg = $("#txtMessage").val();
    if (msg.length > 0) {
        var userName = $('#hUserName').val();
        // <<<<<-- ***** Return to Server [ SaveMessagetoDB ] *****
        objHub.server.saveMessagetoDB(userName, msg);
    }
});
SignalR is great for a chat application, and you wouldn't even need to store anything in SQL unless you want to produce a transcript of the chat at a later time (which may not even be necessary).
I suggest getting the chat working with SignalR first (don't do anything with SQL). Once that is working, you can add SQL logging to your SignalR hub as necessary.
It most likely makes the most sense to write to SQL on each message.
If you decide to store the chats in a database, you will need to insert/update the messages as they happen.
If you are using a PersistentConnection, you can hook into events such as OnReceivedAsync (or OnConnectedAsync, shown below) and insert/update data from the event:
protected override Task OnConnectedAsync(IRequest request, string connectionId)
{
    _clients.Add(connectionId, string.Empty);
    ChatData chatData = new ChatData("Server", "A new user has joined the room.");
    return Connection.Broadcast(chatData);
}
Or, in the SignalR class that inherits from Hub, you can persist to the DB right before you notify any clients.
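For example, here is a minimal sketch of that pattern, reusing the TestEntities1 and tbl_Conversation types from the question; the ChatHub class, its Send method, and the addMessage client callback are assumptions for illustration:

using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    public void Send(string userName, string message)
    {
        // persist first, so the message is stored before anyone is notified
        using (var ctx = new TestEntities1())
        {
            ctx.tbl_Conversation.Add(new tbl_Conversation { Msg = message });
            ctx.SaveChanges();
        }
        // then broadcast to all connected clients
        Clients.All.addMessage(userName, message);
    }
}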

How to schedule SSIS import when the data source is daily excel sheet in email?

Hi, I have a specific question regarding scheduling an SSIS import:
I have a data source that sends a scheduled Excel sheet to my email inbox on a daily basis. The goal is to find a solution that pulls this daily Excel attachment into SSIS and schedules the import into SQL Server on its own.
Is it possible at all? If anyone could provide some useful links, or pointers on where to start looking, it would be much appreciated.
Thank you
Since there was no answer about the mail-client part yet, I am just going to throw this out there.
This will work with Gmail and is configured for it.
First things first, you have to make sure that you enable POP (this allows the process to read your inbox). It is suggested that you select "enable POP from now on", as it will only allow viewing items from that point forward.
Once you have done that, you need to get the NuGet package for OpenPop.NET.
Now the fun part. Please keep in mind this is not proper coding practice, and you are responsible for adding the necessary security precautions and error handling. This is purely a proof of concept.
using System;
using System.Collections.Generic;
using System.IO;
using OpenPop.Pop3;
using OpenPop.Mime;
using OpenPop.Mime.Header;

// create the client to be used
Pop3Client client = new Pop3Client();
// connect to the host via server name, port, and use-SSL flag
client.Connect("pop.gmail.com", 995, true);
// log into the specific account to read
client.Authenticate("username", "password");
// get the count of emails in the inbox
int msgcount = client.GetMessageCount();

// loop through the available message numbers, counting down from the total
while (msgcount > 0)
{
    // gets the message header info: to, from, subject, etc.
    MessageHeader header = client.GetMessageHeaders(msgcount);
    // read the subject line
    string subject = header.Subject;
    // compare the subject to identify the correct email
    if (subject.ToLower() == "subject to match")
    {
        // gets the full message for this message number
        var message = client.GetMessage(msgcount);
        // list the attachments available in the message
        List<MessagePart> attachments = message.FindAllAttachments();
        // loop through the attachments
        foreach (var file in attachments)
        {
            // use the attachment's file name for the output stream
            string filename = file.FileName;
            // create a stream to write the file to disk
            var stream = new FileStream(@"destination path" + filename, FileMode.Create, FileAccess.ReadWrite);
            // save the attachment
            file.Save(stream);
            // close the stream to avoid hung file handles
            stream.Close();
        }
        // optional, and must be configured to be allowed in your email client
        client.DeleteMessage(msgcount);
    }
    // move on to the next message number
    msgcount--;
}

// This is extremely important if deleting or manipulating files in the inbox:
// the DeleteMessage call above only marked the message for deletion.
// Disposing the client commits that change, and it also closes the client
// connection so that no connections are left open.
client.Dispose();
This can be added to a Script Task in your SSIS package; you can then download the Excel file and import it as you normally would, just as if you had manually pulled the file and placed it on a hard disk or network drive.
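For orientation, here is a minimal sketch of what the Script Task entry point might look like; DownloadAttachments is a hypothetical wrapper around the OpenPop code above, and Dts and ScriptResults come from the SSIS-generated script template:

public void Main()
{
    try
    {
        // hypothetical helper containing the OpenPop download logic above
        DownloadAttachments();
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        // surface the failure to the SSIS runtime so the job can react
        Dts.Events.FireError(0, "Mail import", ex.Message, string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}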

How to read a text file continuously with SQL Server?

Does anyone have an idea how to read a text file (i.e. a log file being populated continuously) from SQL Server and import it continuously into a SQL Server table?
I would like to use only T-SQL, within a stored procedure for instance.
I have not found any option in BULK INSERT or OPENROWSET other than reading the whole file at once. I would have to do that repeatedly and look for rows not yet imported, which is a possibility, but not very efficient if the file gets large.
Is it possible to read only the latest rows at each run?
Thanks !
Philippe
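To illustrate the "look for rows not yet imported" idea in pure T-SQL, here is a hedged sketch: record how many rows have already been loaded in a control table and skip them with FIRSTROW on the next run. dbo.ImportState and dbo.LogLines are hypothetical tables, and this assumes the log file is strictly append-only. Because FIRSTROW must be a literal, the statement is built dynamically:

-- hypothetical control table: one row recording how far we have read
DECLARE @first INT = (SELECT last_row + 1 FROM dbo.ImportState WHERE name = 'mylog');

-- BULK INSERT requires a literal FIRSTROW, so build the statement dynamically
DECLARE @sql NVARCHAR(MAX) =
    N'BULK INSERT dbo.LogLines FROM ''C:\Logs\mylog.txt'' WITH (FIRSTROW = '
    + CAST(@first AS NVARCHAR(10)) + N');';
EXEC sp_executesql @sql;

-- remember how many rows are now loaded (assumes dbo.LogLines holds only this file)
UPDATE dbo.ImportState
SET last_row = (SELECT COUNT(*) FROM dbo.LogLines)
WHERE name = 'mylog';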
You could use a FileSystemWatcher in order to get notified when the log file changes:
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = @"C:\PathOfTheLogfile";
watcher.Filter = "MyLogfile.txt"; // you can also use wildcards here
watcher.NotifyFilter = NotifyFilters.LastWrite;
watcher.Changed += new FileSystemEventHandler(Watcher_Changed);
watcher.Created += new FileSystemEventHandler(Watcher_Changed);
watcher.EnableRaisingEvents = true; // start watching
...
private static void Watcher_Changed(object source, FileSystemEventArgs e)
{
    if (e.ChangeType == WatcherChangeTypes.Created)
    {
        //TODO: read log from beginning
    }
    else
    {
        //TODO: read log from last position
    }
    //TODO: write to MSSQL
    //TODO: remember last log file position
}
Sometimes the FileSystemWatcher events fire while the file is still being written to, so you might have to add a delay before reading the log file.
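As a minimal sketch of the "read from last position" TODO, assuming the log is strictly append-only (lastPosition is a field you would keep, or persist, between events):

using System.IO;

private static long lastPosition = 0; // byte offset already imported

private static string ReadNewLogText(string path)
{
    // FileShare.ReadWrite lets the log writer keep appending while we read
    using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        fs.Seek(lastPosition, SeekOrigin.Begin); // skip what was already imported
        using (var reader = new StreamReader(fs))
        {
            string appended = reader.ReadToEnd(); // only the newly appended rows
            lastPosition = fs.Position;           // remember for the next event
            return appended;
        }
    }
}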
You might be able to use a linked server to the text file if your file is parseable by the Jet provider.

Parse csv or excel file in Silverlight

I've parsed these files in regular C# applications, but the IO methods for the files are different in Silverlight, and I can't seem to find the right methods. Searches haven't turned up any information I can use. For the real application I'll be receiving XML from the server, but for the prototype I just need to parse a file with some sample data in it.
You can save the Excel file as XML. An example can be found in this link
This way you can keep your import procedure the same and process the data as when you go live.
To access files on the user's machine, you are required to use OpenFileDialog and SaveFileDialog. Without elevated trust (which requires an out-of-browser app), you will not be able to learn anything more than the file name the user selected for input/saving; you will have no idea what the path to this file is. The dialog can only be shown as the result of a user action such as clicking a button; otherwise it will fail, because Silverlight does not want malicious code prompting the user with annoying dialogs automatically.
To do this you would do something as follows:
var openFile = new OpenFileDialog();
if (openFile.ShowDialog() == true) // sadly this returns a nullable bool, so the comparison is necessary
{
    using (Stream myStream = openFile.File.OpenRead())
    using (var reader = new StreamReader(myStream))
    {
        ...
    }
}
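From there, a minimal sketch of parsing the sample data, assuming simple comma-separated lines with no quoted fields or embedded commas:

using System.Collections.Generic;
using System.IO;

// naive CSV reader: one record per line, fields split on commas
static List<string[]> ParseCsv(StreamReader reader)
{
    var rows = new List<string[]>();
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        rows.Add(line.Split(','));
    }
    return rows;
}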
