Silverlight: StreamReader works slowly for browser cache

In an SL4 application I issue a lot of requests (about 30) to the server at roughly the same time (within ~3 seconds) using HttpWebRequest. As a result I receive the same number of responses and start to process the streams:
HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
// Read the whole response body and parse it as XML
using (var stream = new StreamReader(response.GetResponseStream()))
{
    var str = stream.ReadToEnd();
    var doc = XDocument.Parse(str);
}
All responses are received (the server returned 304 - Not Modified).
For each response the program creates a new thread (because of the asynchronous result).
When I try to fetch data from the stream, the stream.ReadToEnd() call takes a lot of time. I don't know what happens inside ReadToEnd(), but it seems that the stream reading is performed on a single thread, or that some locking occurs. Any ideas?
Why does this happen? Everything is encapsulated in separate threads and works fast, except this method.

Presumably you are using the browser HTTP stack, in which case your browser will almost certainly have a limit on the number of connections you can make to a given subdomain at once; in many browsers this is two.
A common trick is to use multiple subdomains to serve content.
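If the browser stack's connection limit is indeed the bottleneck, another option worth evaluating (an aside, not part of the answer above) is Silverlight's client HTTP stack, which bypasses the browser for the registered requests. A minimal sketch, with a placeholder host prefix:
// Register once, e.g. at application startup, before any requests are created.
// "http://myserver.example.com/" is a placeholder for your service's host.
bool registered = WebRequest.RegisterPrefix(
    "http://myserver.example.com/",
    System.Net.Browser.WebRequestCreator.ClientHttp);
// Requests created afterwards for that prefix go through the client stack:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://myserver.example.com/data.xml");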


SqlFileStream: Returning stream vs byte array in HTTP response

I'm a little confused about returning a byte array vs. a stream in an HTTP response using ASP.NET Web API.
I came across the following code:
SqlConnection conn = new SqlConnection();
SqlCommand cmd = conn.CreateCommand();
cmd.CommandText = "Select FileData.PathName() As FilePath, GET_FILESTREAM_TRANSACTION_CONTEXT() AS Context From FileStorage";
conn.Open();
SqlDataReader reader = cmd.ExecuteReader();
reader.Read();
string filePath = (string)reader["FilePath"];
byte[] fileBytes = (byte[])reader["Context"];
SqlFileStream stream = new SqlFileStream(filePath, fileBytes, FileAccess.Read);
result.Content = new StreamContent(stream);
result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
Question 1:
Why would they return a Stream instead of a byte array in the HTTP Response?
Question 2:
Why create a SqlFileStream to read the data if the byte array is already available by calling (byte[])reader["Context"]? Wouldn't this mean that the entire file contents are read into memory? So why the need for a Stream?
Question 1: Why would they return a Stream instead of a byte array in the HTTP Response?
Because the byte array may be huge, so if you read the entire array into the memory of the server and keep it in memory until it has all been transmitted to the client you are imposing a huge memory burden on the server. That's the stuff Denial-Of-Service attacks are made of. By using a stream you allow the server to load the data in small chunks, on an as-needed basis, and to keep only a small chunk in memory at any given time, while waiting for it to be transmitted.
Question 2: Why create a SqlFileStream to read the data if the byte array is already available by calling (byte[])reader["Context"]? Wouldn't this mean that the entire file contents are read into memory? So why the need for a Stream?
The byte array that you see there is not the actual file contents. If you look at the documentation of the SqlFileStream constructor, and also at the documentation of the SqlFileStream class, this byte array is a "transaction context", which is necessary (a terrible hack, arguably) for the database server to read the actual data from storage. The actual data is potentially huge, so the code that you posted does all this in order to avoid loading it all into memory.
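For context, here is a rough sketch of how that transaction context is usually obtained. GET_FILESTREAM_TRANSACTION_CONTEXT() only returns a value inside an explicit transaction, and the transaction has to stay open for as long as the SqlFileStream is being read (the connection string is a placeholder, the table and column names are taken from the question, and the Web API code above would additionally need to keep the transaction alive until the response has been streamed):
using (var conn = new SqlConnection(connectionString)) // placeholder connection string
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    {
        var cmd = conn.CreateCommand();
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() FROM FileStorage";
        string path;
        byte[] txContext;
        using (var reader = cmd.ExecuteReader())
        {
            reader.Read();
            path = reader.GetString(0);
            txContext = (byte[])reader[1];
        }
        // The stream reads the file data directly from the FILESTREAM store,
        // in chunks, without materializing it as a byte[].
        using (var fileStream = new SqlFileStream(path, txContext, FileAccess.Read))
        {
            // ... copy fileStream to the response in small chunks ...
        }
        tran.Commit();
    }
}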
Buffering is the main reason for returning StreamContent. In ASP.NET Web API, a StreamContent response is not buffered, whereas a byte-array response is already fully buffered and available to serve. In the byte[] case the content of the HttpResponseMessage can be set directly from your byte[]; there is no need to convert it to a Stream first.
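For example (a sketch, assuming fileContents is a byte[] that already holds the entire file, which is not what the transaction context in the question's code contains):
var result = new HttpResponseMessage(HttpStatusCode.OK);
// The whole payload is already in memory, so hand it to the response directly.
result.Content = new ByteArrayContent(fileContents);
result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");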
In addition, consider using PushStreamContent in scenarios where you want to stream binary content to the client continuously, so the client can consume your API progressively as the data arrives, similar to the following code snippet:
var httpResponseMessage = new HttpResponseMessage
{
    Content = new PushStreamContent(async (respStream, content, context) =>
    {
        // Write to the response stream as data becomes available; disposing the
        // writer closes the stream, which completes the response.
        using (var writer = new StreamWriter(respStream))
        {
            await writer.WriteLineAsync();
            await writer.FlushAsync();
        }
    }, "text/plain")
};

Should I Be Using Async Calls?

I have a C# application which reads a table of roughly 1500 site URLs of clients who have been with the company since we started. Basically, I am running whois queries on these URLs and seeing if they are still a client or not. The application works, but it takes roughly an hour to complete. Would I be better off using async whois queries, and roughly how much time could I save?
Here is a sample whois query block of code that I am using.
Also, if anyone has any tips on how to improve this code or run async commands, could you please help me out, as I'm only an intern. Thanks.
string whoisServer = "whois.markmonitor.com";
string data;
try
{
    TcpClient objTCPC = new TcpClient(whoisServer, 43);
    string strDomain = domainName + "\r\n";
    byte[] arrDomain = Encoding.ASCII.GetBytes(strDomain);
    // Send the domain name to the whois server
    Stream objStream = objTCPC.GetStream();
    objStream.Write(arrDomain, 0, arrDomain.Length);
    // Read the whole whois response
    using (StreamReader reader = new StreamReader(objTCPC.GetStream(), Encoding.ASCII))
    {
        data = reader.ReadToEnd();
    }
    objTCPC.Close();
}
catch
{
    data = "Not Found";
}
return data;
Well, the short answer is certainly yes.
Since you are making multiple, completely independent lookups, you have everything to gain by running them in parallel, asynchronously.
There are several ways to do this. The options depend on which version of .NET you're on.
As you would guess, there are many examples. Check these out right here on SO, and see the sketch after the links:
Available parallel technologies in .Net
Multi threaded file processing with .NET
When to use a Parallel.ForEach loop instead of a regular foreach?
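For example, here is a minimal sketch (assuming .NET 4.5, and reusing the whois.markmonitor.com server and port 43 from the question) that turns each lookup into an awaitable task, so all of them can run concurrently with Task.WhenAll instead of one blocking TCP round trip at a time:
static async Task<string> WhoisAsync(string domainName)
{
    try
    {
        using (var client = new TcpClient())
        {
            await client.ConnectAsync("whois.markmonitor.com", 43);
            byte[] query = Encoding.ASCII.GetBytes(domainName + "\r\n");
            using (var stream = client.GetStream())
            {
                // Send the query, then read the whole whois response.
                await stream.WriteAsync(query, 0, query.Length);
                using (var reader = new StreamReader(stream, Encoding.ASCII))
                {
                    return await reader.ReadToEndAsync();
                }
            }
        }
    }
    catch
    {
        return "Not Found";
    }
}
// Usage: run the lookups concurrently (consider batching so you don't open
// 1500 sockets at once):
// string[] results = await Task.WhenAll(domains.Select(WhoisAsync));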

GAE/Java LocalChannelFailureException at development server

I'm using the Channel API (Java) with Google App Engine for my web application. I have implemented a token-reusing mechanism so that the Channel API quotas are not exceeded as quickly.
This means that my implementation reuses an existing channel for a user who refreshes the page, as long as the expiration time of the token received from the ChannelService.createChannel() call has not passed.
When refreshing my page I get the following exception (with x starting at 0 and increasing with every refresh). However, my page continues to work as intended. Is there a way to avoid the exception being thrown? Or can I just ignore it?
com.google.appengine.api.channel.dev.LocalChannelFailureException: Client connection with ID connection-x not found.
at com.google.appengine.api.channel.dev.Channel.getClientMessageQueue(Channel.java:79)
at com.google.appengine.api.channel.dev.ChannelManager.getNextClientMessage(ChannelManager.java:300)
at com.google.appengine.api.channel.dev.LocalChannelServlet.doGet(LocalChannelServlet.java:120)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1166)
...
I'm reusing tokens with the following classes.
When calling ChannelService.createChannel() I save the expiration date and the generated token in an entity called "Channel":
public class Channel {
    private String id;
    private String token;
    private Date expiration;
    // constructor and getters omitted
}
Then I have a ChannelService class that returns a valid Channel from its get() method. The channelDAO is a class that just uses a Map for storing Channels, so there is no database persistence that would keep a token alive across a server restart.
public Channel get(String clientId) {
    Calendar calendar = Calendar.getInstance();
    Channel channel = channelDAO.get(clientId);
    // Create a new channel only if none exists yet or the old token has expired
    if (channel == null || calendar.getTime().after(channel.getExpiration())) {
        com.google.appengine.api.channel.ChannelService channelService =
                ChannelServiceFactory.getChannelService();
        calendar.add(Calendar.MINUTE, CHANNEL_UPTIME);
        String token = channelService.createChannel(clientId, CHANNEL_UPTIME);
        channel = new Channel(clientId, token, calendar.getTime());
        channelDAO.persist(channel);
    }
    return channel;
}
I fixed the problem after further investigation into the source of the exception. The Channel API works with polling requests that are executed every 500 ms. I used Firefox's console to track these. Here is an example poll:
[20:40:15.978] GET http://localhost:8080/_ah/channel/dev?command=poll&channel=920a60f9b27ece1a1ba43d251fdacf2e-channel-eqt3xi-1385927324758-{clientId}&client=connection-2 [HTTP/1.1 200 OK 0ms]
In my question I stated that the exception occurs on page reload, so the problem was this: when the page is reloaded, something happens (I don't know what exactly, but I assume it has something to do with sockets getting closed and reopened on page refresh) which causes the client (the last parameter of the GET request) to no longer be available. However, a new client is available: the client "connection-{i+1}". So when you enter the page initially, the client is "connection-0"; after a page refresh it is "connection-1". But because the old page used a delayed execution for the poll, a stale request (still connection-0) is sent to the server, which, as a result, throws the exception.
I fixed the problem by manually cancelling the delayed execution when leaving the page, using jQuery:
var channel = new goog.appengine.Channel('${channel.token}');
var socket = channel.open(handler);
// Cancel the pending poll before the page unloads so no stale request is sent
$(window).on('beforeunload', function() {
    clearTimeout(socket.pollingTimer_);
});
Your token re-use scheme should be carefully checked for bugs, as that exception shouldn't occur on every page reload.
There is a known issue after local server restarts, but as stated it should only occur if the development server was restarted.
I had the same issue using GWT and gwt-gae-channel. The solution would be something like:
final Socket socket = channel.open(new SocketListener() {...});
Window.addWindowClosingHandler(new ClosingHandler() {
    @Override
    public void onWindowClosing(ClosingEvent event) {
        socket.close();
    }
});

Download File (using Thread class)

OK, I understand that this may be a very stupid question, but I have never done it before, so I am asking it here. How can I download a file (let's say, from the internet) using the Thread class?
What do you mean by "using the Thread class"? I guess you want to download a file on a separate thread so it does not block your UI or some other part of your program.
I'll assume that you're using C++ and the WinAPI.
First create a thread. This tutorial provides good information about Win32 threads.
This thread will be responsible for downloading the file. To do this you simply connect to the web server on port 80 and send an HTTP GET request for the file you want. It could look similar to this (note the newline characters):
GET /path/to/your/file.jpg HTTP/1.1\r\n
Host: www.host.com\r\n
Connection: close\r\n
\r\n
The server will then answer with an HTTP response containing the file, preceded by a header. Parse this header and read the contents.
More information on HTTP can be found here.
I would suggest that you do not use threads for downloading files. It's better to use asynchronous constructs that are more targeted towards I/O, since they incur a lower overhead than threads. I don't know what version of the .NET Framework you are working with, but in 4.5, something like this should work:
private static Task DownloadFileAsync(string uri, string localPath)
{
// Get the http request
HttpWebRequest webRequest = WebRequest.CreateHttp(uri);
// Get the http response asynchronously
return webRequest.GetResponseAsync()
.ContinueWith(task =>
{
// When the GetResponseAsync task is finished, we will come
// into this continuation (which is an anonymous method).
// Check if the GetResponseAsync task failed.
if (task.IsFaulted)
{
Console.WriteLine(task.Exception);
return null;
}
// Get the web response.
WebResponse response = task.Result;
// Open a file stream for the local file.
FileStream localStream = File.OpenWrite(localPath);
// Copy the contents from the response stream to the
// local file stream asynchronously.
return response.GetResponseStream().CopyToAsync(localStream)
.ContinueWith(streamTask =>
{
// When the CopyToAsync task is finished, we come
// to this continuation (which is also an anonymous
// method).
// Flush and dispose the local file stream. There
// is a FlushAsync method that will flush
// asynchronously, returning yet another task, but
// for the sake of brevity I use the synchronous
// method here.
localStream.Flush();
localStream.Dispose();
// Don't forget to check if the previous task
// failed or not.
// All Task exceptions must be observed.
if (streamTask.IsFaulted)
{
Console.WriteLine(streamTask.Exception);
}
});
// since we end up with a task returning a task we should
// call Unwrap to return a single task representing the
// entire operation
}).Unwrap();
}
You would want to elaborate a bit on the error handling. In short, the code requests the response asynchronously, copies the response stream to a local file asynchronously, and then flushes and disposes the file stream.
See the code comments for a more detailed explanation of how it works.
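A quick usage sketch (the URL and local path are placeholders):
// Kick off the download and block until it completes (e.g. in a console app).
Task download = DownloadFileAsync("http://example.com/file.jpg", @"C:\temp\file.jpg");
download.Wait();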

Dataservice authentication not working with serviceContext.GetReadStreamUri(..)

I have an OData service and a WPF client application.
Some of the OData service entities have images attached to them (e.g. Client).
The streaming works as long as I do not apply authentication: I can view and change the images. Once I enforce authentication, everything works as expected, given the credentials check out, except the images. Here are the relevant code steps / snippets.
Window Constructor code
bool iv = System.Web.Security.Membership.ValidateUser("userName", "pass");
ManageService = new InventoryContext(new Uri(...));
ManageService.SendingRequest += new EventHandler<SendingRequestEventArgs>(ManageService_SendingRequest);
ManageService_SendingRequest code
//attach the authentication cookie to the request header
((HttpWebRequest)e.Request).CookieContainer = ((ClientFormsIdentity)Thread.CurrentPrincipal.Identity).AuthenticationCookies;
The call to fetch the data is asynchronous, using a BackgroundWorker.
Query method()
BackgroundWorker worker = new BackgroundWorker();
worker.DoWork += new DoWorkEventHandler(FetchClient);
worker.RunWorkerCompleted += new RunWorkerCompletedEventHandler(FetchClientsCompleted);
worker.RunWorkerAsync(ClientUUID);
FetchClient
var query = from o in ManageService.Clients where o.ClientUUID.Equals((Guid)e.Argument)
...
e.Result = query;
FetchClientsCompleted
var res = e.Result as DataServiceCollection<Client>;
DataContext = res[0]; //this is all working, with and without authentication
//the next line, binding the stream to the image throws 'unauthenticated'
//it works well if authentication is disabled
imgClient.Source = new BitmapImage(ManageService.GetReadStreamUri(DataContext));
If I debug, the SendingRequest method, which is usually called for any query request, is NOT triggered when calling GetReadStreamUri(...).
This is where I am stuck: what do I have to do to authenticate to the service in order to get the stream?
Also, I took the URI generated by ManageService.GetReadStreamUri(DataContext) and pasted it into the browser, and it works: the image is displayed in the browser, if logged in.
Anyone any ideas?
The SendingRequest handler will only fire for requests sent by the DataServiceContext class (your ManageService). But in the case of the picture, you only get the URL from the DataServiceContext and then let the BitmapImage actually issue the HTTP request to that URL, so the event won't fire for that request. I don't know if BitmapImage has a way for you to hook into the HTTP request pipeline (I don't think it does).
You could issue that request yourself and then use the response stream as the input for the bitmap image, in which case you get full control over the request and thus can implement authentication as appropriate.
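A rough sketch of that approach (assuming the same ClientFormsIdentity cookie container used in ManageService_SendingRequest; imgClient and DataContext are the names from the question):
// Build the request for the media resource URI ourselves so the
// authentication cookie can be attached to it.
var request = (HttpWebRequest)WebRequest.Create(ManageService.GetReadStreamUri(DataContext));
request.CookieContainer = ((ClientFormsIdentity)Thread.CurrentPrincipal.Identity).AuthenticationCookies;

using (var response = request.GetResponse())
using (var responseStream = response.GetResponseStream())
{
    // BitmapImage needs a seekable stream, so buffer the response in memory first.
    var buffer = new MemoryStream();
    responseStream.CopyTo(buffer);
    buffer.Position = 0;

    var bitmap = new BitmapImage();
    bitmap.BeginInit();
    bitmap.CacheOption = BitmapCacheOption.OnLoad;
    bitmap.StreamSource = buffer;
    bitmap.EndInit();

    imgClient.Source = bitmap;
}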
