Injecting server-side data when using ISpaBuilder.UseReactDevelopmentServer

When using ASP.NET (Core, .NET 5) MVC's IApplicationBuilder.UseSpa / ISpaBuilder.UseReactDevelopmentServer (in development), is there a way to postprocess the index HTML before it's sent to the browser? I need to inject a script tag holding data about the currently auth'd user to be consumed by the React app.
I want to avoid having to do an extra call from inside my React app just to get the currently logged on user at startup.

You can use custom middleware to do that. Assuming you're on .NET Core 3.1, the middleware would look something along these lines:
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using System.IO;
using System.IO.Compression;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
namespace TestReactDevServer.Middleware
{
    public class ScriptInjectorMiddleware
    {
        private readonly RequestDelegate _next;

        public ScriptInjectorMiddleware(RequestDelegate next)
        {
            _next = next;
        }

        public async Task InvokeAsync(HttpContext context)
        {
            // Save a pointer to the original response body stream
            var originalBodyStream = context.Response.Body;

            // Create a new memory stream...
            using (var responseBody = new MemoryStream())
            {
                // ...and use it for the rest of the pipeline so we can peek at the contents
                context.Response.Body = responseBody;

                // Continue down the middleware pipeline, eventually returning to this class
                await _next(context);

                // Inspect the response and inject the script
                await InjectScript(responseBody, "window.myUser='123';");

                // Copy the contents of the modified memory stream back to the original stream
                responseBody.Seek(0, SeekOrigin.Begin);
                await responseBody.CopyToAsync(originalBodyStream);
                context.Response.Body = originalBodyStream;
            }
        }

        private async Task<Stream> InjectScript(Stream input, string script)
        {
            // The proxied response is gzip-compressed, so decompress it first
            input.Seek(0, SeekOrigin.Begin);
            var decompressed = new MemoryStream();
            using (var tmp = new GZipStream(input, CompressionMode.Decompress, true))
            {
                tmp.CopyTo(decompressed);
            }

            var html = await decompressed.StreamToString();
            var modifiedHtml = Regex.Replace(html, "</body>[\\n\\r]+</html>",
                $"<script type=\"text/javascript\">{script}</script></body></html>",
                RegexOptions.IgnoreCase | RegexOptions.Multiline); // any way to locate the closing tags will work here, you can probably be more efficient

            // Overwrite the original stream with the re-compressed, modified HTML
            input.SetLength(0);
            using (var modifiedHtmlStream = modifiedHtml.ToStream())
            using (var tmp = new GZipStream(input, CompressionMode.Compress, true)) // might be optional
            {
                modifiedHtmlStream.CopyTo(tmp);
            }
            return input;
        }
    }

    public static class ScriptInjectorMiddlewareExtensions
    {
        public static IApplicationBuilder UseScriptInjectorMiddleware(
            this IApplicationBuilder builder)
        {
            return builder.UseMiddleware<ScriptInjectorMiddleware>();
        }
    }

    public static class StreamExtensions
    {
        public static async Task<string> StreamToString(this Stream stream)
        {
            stream.Seek(0, SeekOrigin.Begin);
            return await new StreamReader(stream).ReadToEndAsync();
        }

        public static Stream ToStream(this string str)
        {
            var stream = new MemoryStream();
            var writer = new StreamWriter(stream);
            writer.Write(str);
            writer.Flush();
            stream.Seek(0, SeekOrigin.Begin);
            return stream;
        }
    }
}
A couple of things to point out:
Testing this with Chrome, I ended up having to decompress the proxied response and compress it back after modification - I think the re-compression step might be optional.
Depending on your user agent you might need to handle more compression cases (see more examples on GitHub).
You will need to register this middleware before your call to .UseSpa() in Startup.cs: adding app.UseScriptInjectorMiddleware(); picks up the included extension method (see the sketch after these notes).
I suspect this example is far from being complete, especially in terms of handling different encodings and content types - I am hoping you'd be able to adapt the idea to your use case.
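For reference, here is a minimal sketch of how the registration could look in Startup.Configure for a .NET Core 3.1 React SPA template; the surrounding calls are just the template defaults and only illustrate the ordering:
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseStaticFiles();
    app.UseSpaStaticFiles();

    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());

    // Register the injector before UseSpa so it can rewrite the proxied index HTML
    app.UseScriptInjectorMiddleware();

    app.UseSpa(spa =>
    {
        spa.Options.SourcePath = "ClientApp";
        if (env.IsDevelopment())
        {
            spa.UseReactDevelopmentServer(npmScript: "start");
        }
    });
}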

Related

Loading supported cultures from DB in .NET Core

I have a Language entity with all supported languages in my db, each language has a culture string attribute. I want to load supported cultures from DB.
In my service initializer I have this:
public void ConfigureServices(IServiceCollection services)
{
    // ... previous configuration not shown
    services.Configure<RequestLocalizationOptions>(
        opts =>
        {
            var supportedCultures = new List<CultureInfo>
            {
                new CultureInfo("en-GB"),
                new CultureInfo("en-US"),
                new CultureInfo("en"),
                new CultureInfo("fr-FR"),
                new CultureInfo("fr"),
            };
            opts.DefaultRequestCulture = new RequestCulture("en-GB");
            // Formatting numbers, dates, etc.
            opts.SupportedCultures = supportedCultures;
            // UI strings that we have localized.
            opts.SupportedUICultures = supportedCultures;
        });
}
How can I access my DB context inside it?
Is there any better way to do it?
I don't think there's an out-of-the-box solution for this.
However, you can implement your own middleware that achieves this by using ASP.NET's RequestLocalizationMiddleware:
public class CustomRequestLocalizationMiddleware
{
    private readonly RequestDelegate next;
    private readonly ILoggerFactory loggerFactory;

    public CustomRequestLocalizationMiddleware(RequestDelegate next, ILoggerFactory loggerFactory)
    {
        this.next = next;
        this.loggerFactory = loggerFactory;
    }

    public async Task Invoke(HttpContext context /* You can inject services here, such as DbContext or IDbConnection */)
    {
        // You can search your database for your supported and/or default languages here
        // This query will execute for all requests, so consider using caching
        var cultures = await Task.FromResult(new[] { "en" });
        var defaultCulture = await Task.FromResult("en");

        // You can configure the options here as you would do by calling services.Configure<RequestLocalizationOptions>()
        var options = new RequestLocalizationOptions()
            .AddSupportedCultures(cultures)
            .AddSupportedUICultures(cultures)
            .SetDefaultCulture(defaultCulture);

        // Finally, we instantiate ASP.NET's default RequestLocalizationMiddleware and call it
        var defaultImplementation = new RequestLocalizationMiddleware(next, Options.Create(options), loggerFactory);
        await defaultImplementation.Invoke(context);
    }
}
Then, we inject the required services and use the custom middleware in Startup.cs or Program.cs as follows:
services.AddLocalization();
/* ... */
app.UseMiddleware<CustomRequestLocalizationMiddleware>();
Do not call app.UseRequestLocalization(), because this would call ASP.Net's RequestLocalizationMiddleware again with the default options, and override the culture that has been resolved previously.
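As an illustration of the database lookup mentioned in the comments above, here is a minimal sketch that reads the cultures from a hypothetical LanguageDbContext and caches the result with IMemoryCache; the entity, property and cache key names are assumptions, not part of the original answer:
public async Task Invoke(HttpContext context, LanguageDbContext db, IMemoryCache cache)
{
    // Cache the culture list so the database is not queried on every request (assumed names)
    var cultures = await cache.GetOrCreateAsync("supported-cultures", async entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30);
        return await db.Languages
            .Select(l => l.Culture) // e.g. "en-GB", "fr-FR"
            .ToArrayAsync();
    });

    var options = new RequestLocalizationOptions()
        .AddSupportedCultures(cultures)
        .AddSupportedUICultures(cultures)
        .SetDefaultCulture(cultures.FirstOrDefault() ?? "en");

    var defaultImplementation = new RequestLocalizationMiddleware(next, Options.Create(options), loggerFactory);
    await defaultImplementation.Invoke(context);
}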

Integration testing with in-memory IdentityServer

I have an API that uses IdentityServer4 for token validation.
I want to unit test this API with an in-memory TestServer. I'd like to host the IdentityServer in the in-memory TestServer.
I have managed to create a token from the IdentityServer.
This is how far I've come, but I get an error "Unable to obtain configuration from http://localhost:54100/.well-known/openid-configuration"
The API uses the [Authorize] attribute with different policies. This is what I want to test.
Can this be done, and what am I doing wrong?
I have tried to look at the source code for IdentityServer4, but have not come across a similar integration test scenario.
protected IntegrationTestBase()
{
    var startupAssembly = typeof(Startup).GetTypeInfo().Assembly;
    _contentRoot = SolutionPathUtility.GetProjectPath(@"<my project path>", startupAssembly);
    Configure(_contentRoot);
    var orderApiServerBuilder = new WebHostBuilder()
        .UseContentRoot(_contentRoot)
        .ConfigureServices(InitializeServices)
        .UseStartup<Startup>();
    orderApiServerBuilder.Configure(ConfigureApp);
    OrderApiTestServer = new TestServer(orderApiServerBuilder);
    HttpClient = OrderApiTestServer.CreateClient();
}

private void InitializeServices(IServiceCollection services)
{
    var cert = new X509Certificate2(Path.Combine(_contentRoot, "idsvr3test.pfx"), "idsrv3test");
    services.AddIdentityServer(options =>
        {
            options.IssuerUri = "http://localhost:54100";
        })
        .AddInMemoryClients(Clients.Get())
        .AddInMemoryScopes(Scopes.Get())
        .AddInMemoryUsers(Users.Get())
        .SetSigningCredential(cert);
    services.AddAuthorization(options =>
    {
        options.AddPolicy(OrderApiConstants.StoreIdPolicyName, policy => policy.Requirements.Add(new StoreIdRequirement("storeId")));
    });
    services.AddSingleton<IPersistedGrantStore, InMemoryPersistedGrantStore>();
    services.AddSingleton(_orderManagerMock.Object);
    services.AddMvc();
}

private void ConfigureApp(IApplicationBuilder app)
{
    app.UseIdentityServer();
    JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();
    var options = new IdentityServerAuthenticationOptions
    {
        Authority = _appsettings.IdentityServerAddress,
        RequireHttpsMetadata = false,
        ScopeName = _appsettings.IdentityServerScopeName,
        AutomaticAuthenticate = false
    };
    app.UseIdentityServerAuthentication(options);
    app.UseMvc();
}
And in my unit-test:
private HttpMessageHandler _handler;
const string TokenEndpoint = "http://localhost/connect/token";

public Test()
{
    _handler = OrderApiTestServer.CreateHandler();
}

[Fact]
public async Task LeTest()
{
    var accessToken = await GetToken();
    HttpClient.SetBearerToken(accessToken);
    var httpResponseMessage = await HttpClient.GetAsync("stores/11/orders/asdf"); // Fails on this line
}

private async Task<string> GetToken()
{
    var client = new TokenClient(TokenEndpoint, "client", "secret", innerHttpMessageHandler: _handler);
    var response = await client.RequestClientCredentialsAsync("TheMOON.OrderApi");
    return response.AccessToken;
}
You were on the right track with the code posted in your initial question.
The IdentityServerAuthenticationOptions object has properties to override the default HttpMessageHandlers it uses for back channel communication.
Once you combine this with the CreateHandler() method on your TestServer object you get:
// Build the IdentityServer TestServer here
var idBuilder = new WebHostBuilder();
idBuilder.UseStartup<Startup>();
//...
TestServer identityTestServer = new TestServer(idBuilder);
var identityServerClient = identityTestServer.CreateClient();
var token = //use identityServerClient to get a token from IdentityServer

// Build the API TestServer
var options = new IdentityServerAuthenticationOptions()
{
    Authority = "http://localhost:5001",
    // IMPORTANT PART HERE
    JwtBackChannelHandler = identityTestServer.CreateHandler(),
    IntrospectionDiscoveryHandler = identityTestServer.CreateHandler(),
    IntrospectionBackChannelHandler = identityTestServer.CreateHandler()
};

var apiBuilder = new WebHostBuilder();
apiBuilder.ConfigureServices(c => c.AddSingleton(options));
// Build the API server here
var apiClient = new TestServer(apiBuilder).CreateClient();
apiClient.SetBearerToken(token);

// Proceed with auth testing
This allows the AccessTokenValidation middleware in your Api project to communicate directly with your In-Memory IdentityServer without the need to jump through hoops.
As a side note, for an Api project, I find it useful to add IdentityServerAuthenticationOptions to the services collection in Startup.cs using TryAddSingleton instead of creating it inline:
public void ConfigureServices(IServiceCollection services)
{
    services.TryAddSingleton(new IdentityServerAuthenticationOptions
    {
        Authority = Configuration.IdentityServerAuthority(),
        ScopeName = "api1",
        ScopeSecret = "secret",
        //...,
    });
}

public void Configure(IApplicationBuilder app)
{
    var options = app.ApplicationServices.GetService<IdentityServerAuthenticationOptions>();
    app.UseIdentityServerAuthentication(options);
    //...
}
This allows you to register the IdentityServerAuthenticationOptions object in your tests without having to alter the code in the API project, as shown in the sketch below.
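For example, a test can pre-register the options before Startup runs; because Startup uses TryAddSingleton, the test's registration wins. This is a minimal sketch under that assumption, with illustrative values:
var apiBuilder = new WebHostBuilder()
    .ConfigureServices(services =>
    {
        // Added before Startup.ConfigureServices runs, so TryAddSingleton in Startup is a no-op
        services.AddSingleton(new IdentityServerAuthenticationOptions
        {
            Authority = "http://localhost:5001",
            ScopeName = "api1",
            ScopeSecret = "secret",
            RequireHttpsMetadata = false,
            JwtBackChannelHandler = identityTestServer.CreateHandler()
        });
    })
    .UseStartup<Startup>();

var apiClient = new TestServer(apiBuilder).CreateClient();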
I understand there is a need for a more complete answer than what @james-fera posted. I have learned from his answer and made a GitHub project consisting of a test project and an API project.
https://github.com/emedbo/identityserver-test-template
The IdentityServerSetup.cs class https://github.com/emedbo/identityserver-test-template/blob/master/tests/API.Tests/Config/IdentityServerSetup.cs can be abstracted away (e.g. moved into a NuGet package), leaving the base class IntegrationTestBase.cs.
The essence is that you can make the test IdentityServer work just like a normal IdentityServer, with users, clients, scopes, passwords, etc. I have made the DELETE method [Authorize(Roles = "admin")] to prove this.
Instead of posting code here, I recommend reading @james-fera's post to get the basics, then pulling my project and running the tests.
IdentityServer is such a great tool, and with the ability to use the TestServer framework it gets even better.
I think you probably need to make a test double fake for your authorization middleware depending on how much functionality you want. So basically you want a middleware that does everything that the Authorization middleware does minus the back channel call to the discovery doc.
IdentityServer4.AccessTokenValidation is a wrapper around two middlewares. The JwtBearerAuthentication middleware, and the OAuth2IntrospectionAuthentication middleware. Both of these grab the discovery document over http to use for token validation. Which is a problem if you want to do an in-memory self-contained test.
If you want to go through the trouble you will probably need to make a fake version of app.UseIdentityServerAuthentication that doesn't do the external call that fetches the discovery document. It only populates the HttpContext principal so that your [Authorize] policies can be tested.
Check out how the meat of IdentityServer4.AccessTokenValidation looks here. And follow up with a look at how JwtBearer Middleware looks here
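A minimal sketch of that idea, assuming you only need an authenticated principal for policy evaluation; the claim values are illustrative and would normally mirror what your real tokens contain:
// Test-only replacement for UseIdentityServerAuthentication: no discovery call,
// it simply populates the principal so [Authorize] policies can be evaluated.
app.Use(async (context, next) =>
{
    var claims = new[]
    {
        new Claim("sub", "123"),
        new Claim("scope", "TheMOON.OrderApi"),
        new Claim(ClaimTypes.Role, "admin")
    };
    context.User = new ClaimsPrincipal(new ClaimsIdentity(claims, "TestAuthentication"));
    await next();
});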
We stepped away from trying to host a mock IdentityServer and used dummy/mock authorizers as suggested by others here.
Here's how we did that in case it's useful:
First, we created a function which takes a type, creates a test authentication scheme for it, and adds it to the DI engine using ConfigureTestServices (so that it's applied after the call to Startup):
internal HttpClient GetImpersonatedClient<T>() where T : AuthenticationHandler<AuthenticationSchemeOptions>
{
    var _apiFactory = new WebApplicationFactory<Startup>();
    var client = _apiFactory
        .WithWebHostBuilder(builder =>
        {
            builder.ConfigureTestServices(services =>
            {
                services.AddAuthentication("Test")
                    .AddScheme<AuthenticationSchemeOptions, T>("Test", options => { });
            });
        })
        .CreateClient(new WebApplicationFactoryClientOptions
        {
            AllowAutoRedirect = false,
        });
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Test");
    return client;
}
Then we create what we called 'Impersonators' (AuthenticationHandlers) with the desired roles to mimic users with roles (we use an abstract base class and create derived classes from it to mock different users):
public abstract class Impersonator : AuthenticationHandler<AuthenticationSchemeOptions>
{
    protected Impersonator(
        IOptionsMonitor<AuthenticationSchemeOptions> options,
        ILoggerFactory logger, UrlEncoder encoder, ISystemClock clock)
        : base(options, logger, encoder, clock)
    {
    }

    protected List<Claim> claims = new List<Claim>();

    protected override Task<AuthenticateResult> HandleAuthenticateAsync()
    {
        var identity = new ClaimsIdentity(claims, "Test");
        var principal = new ClaimsPrincipal(identity);
        var ticket = new AuthenticationTicket(principal, "Test");
        var result = AuthenticateResult.Success(ticket);
        return Task.FromResult(result);
    }
}

public class FreeUserImpersonator : Impersonator
{
    public FreeUserImpersonator(
        IOptionsMonitor<AuthenticationSchemeOptions> options,
        ILoggerFactory logger, UrlEncoder encoder, ISystemClock clock)
        : base(options, logger, encoder, clock)
    {
        base.claims.Add(new Claim(ClaimTypes.Role, "FreeUser"));
    }
}
Finally, we can perform our integration tests as follows:
// Arrange
HttpClient client = GetImpersonatedClient<FreeUserImpersonator>();
// Act
var response = await client.GetAsync("api/things");
// Assert
Assert.That.IsSuccessful(response);
Test API startup:
public class Startup
{
    public static HttpMessageHandler BackChannelHandler { get; set; }

    public void Configuration(IAppBuilder app)
    {
        // Accept access tokens from IdentityServer and require a scope of 'Test'
        app.UseIdentityServerBearerTokenAuthentication(new IdentityServerBearerTokenAuthenticationOptions
        {
            Authority = "https://localhost",
            BackchannelHttpHandler = BackChannelHandler,
            ...
        });
        ...
    }
}
Assigning the AuthServer.Handler to TestApi BackChannelHandler in my unit test project:
protected TestServer AuthServer { get; set; }
protected TestServer MockApiServer { get; set; }
protected TestServer TestApiServer { get; set; }

[OneTimeSetUp]
public void Setup()
{
    ...
    AuthServer = TestServer.Create<AuthenticationServer.Startup>();
    TestApi.Startup.BackChannelHandler = AuthServer.CreateHandler();
    TestApiServer = TestServer.Create<TestApi.Startup>();
}
The trick is to create a handler using the TestServer that is configured to use IdentityServer4. Samples can be found here.
I created a NuGet package for this purpose, available to install and test, using the Microsoft.AspNetCore.Mvc.Testing library and the latest version of IdentityServer4.
It encapsulates all the infrastructure code necessary to build an appropriate WebHostBuilder which is then used to create a TestServer by generating the HttpMessageHandler for the HttpClient used internally.
None of the other answers worked for me because they rely on 1) a static field to hold your HttpHandler and 2) the Startup class to have knowledge that it may be given a test handler. I've found the following to work, which I think is a lot cleaner.
First create an object that you can instantiate before your TestHost is created. This is because you won't have the HttpHandler until after the TestHost is created, so you need to use a wrapper.
public class TestHttpMessageHandler : DelegatingHandler
{
    private ILogger _logger;

    public TestHttpMessageHandler(ILogger logger)
    {
        _logger = logger;
    }

    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        _logger.Information($"Sending HTTP message using TestHttpMessageHandler. Uri: '{request.RequestUri.ToString()}'");
        if (WrappedMessageHandler == null) throw new Exception("You must set WrappedMessageHandler before TestHttpMessageHandler can be used.");
        // Invoke the wrapped handler's protected SendAsync via reflection
        var method = typeof(HttpMessageHandler).GetMethod("SendAsync", BindingFlags.Instance | BindingFlags.NonPublic);
        var result = method.Invoke(this.WrappedMessageHandler, new object[] { request, cancellationToken });
        return await (Task<HttpResponseMessage>)result;
    }

    public HttpMessageHandler WrappedMessageHandler { get; set; }
}
Then
var testMessageHandler = new TestHttpMessageHandler(logger);

var webHostBuilder = new WebHostBuilder()
...
    services.PostConfigureAll<JwtBearerOptions>(options =>
    {
        options.Audience = "http://localhost";
        options.Authority = "http://localhost";
        options.BackchannelHttpHandler = testMessageHandler;
    });
...

var server = new TestServer(webHostBuilder);
var innerHttpMessageHandler = server.CreateHandler();
testMessageHandler.WrappedMessageHandler = innerHttpMessageHandler;

Web api large file download with HttpClient

I have a problem with large file downloads from the Web API to a WinForms app. In the WinForms app I'm using HttpClient to grab the data. I have the following code on the server side:
[HttpPost]
[Route]
public async Task<HttpResponseMessage> GetBackup(BackupRequestModel request)
{
    HttpResponseMessage response;
    try
    {
        response = await Task.Run<HttpResponseMessage>(() =>
        {
            var directory = new DirectoryInfo(request.Path);
            var files = directory.GetFiles();
            var lastCreatedFile = files.OrderByDescending(f => f.CreationTime).FirstOrDefault();
            var filestream = lastCreatedFile.OpenRead();
            var fileResponse = new HttpResponseMessage(HttpStatusCode.OK);
            fileResponse.Content = new StreamContent(filestream);
            fileResponse.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
            return fileResponse;
        });
    }
    catch (Exception e)
    {
        logger.Error(e);
        response = Request.CreateResponse(HttpStatusCode.InternalServerError);
    }
    return response;
}
On the client side:
private async void btnStart_Click(object sender, EventArgs e)
{
    var requestModel = new BackupRequestModel();
    requestModel.Username = txtUsername.Text;
    requestModel.Password = txtPassword.Text;
    requestModel.Path = txtServerPath.Text;

    var client = new HttpClient();
    var result = await client.PostAsJsonAsync("http://localhost:50116/api/backup", requestModel);
    var stream = await result.Content.ReadAsStreamAsync();

    var localPath = @"d:\test\filenew.bak";
    var fileStream = File.Create(localPath);
    stream.CopyTo(fileStream);
    fileStream.Close();
    stream.Close();
    fileStream.Dispose();
    stream.Dispose();
    client.Dispose();
}
This is actually working, but the purpose of this program is to grab large files (over 3 GB) and save them on the client.
I have tried this on a file of about 630 MB, and what I notice is: when I call the Web API with HttpClient, HttpClient loads the whole 630 MB into a memory stream and only then copies it from the memory stream to the file stream. When I then try to load a different file I get an OutOfMemoryException, because the application doesn't release the memory from the previously loaded file; I can see in Task Manager that it is holding 635 MB of RAM.
My question is: how can I write the data directly from HttpClient to the file without going through a memory stream? In other words, how can I write data to the file while HttpClient is still downloading it?
To make the request, use a SendAsync overload that allows you to specify an HttpCompletionOption and pass ResponseHeadersRead. You'll have to build the request manually though, without using the PostAsJsonAsync convenience method.
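A minimal sketch of the client side under that suggestion, assuming Newtonsoft.Json for the request body and the same request model, endpoint and target path as above:
using (var client = new HttpClient())
{
    var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:50116/api/backup")
    {
        Content = new StringContent(JsonConvert.SerializeObject(requestModel),
                                    Encoding.UTF8, "application/json")
    };

    // ResponseHeadersRead returns as soon as the headers arrive,
    // so the body is streamed to the file instead of buffered in memory.
    using (var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead))
    using (var stream = await response.Content.ReadAsStreamAsync())
    using (var fileStream = File.Create(@"d:\test\filenew.bak"))
    {
        await stream.CopyToAsync(fileStream);
    }
}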

Uploading a photo stream from the camera into an Azure blob in WP7

I have the following simple application page that uses the phone camera and uploads the captured photo to an Azure blob:
public partial class AddReport : PhoneApplicationPage
{
    // blob storage settings
    string storageAccount = "MYACCOUNT";
    string storageKey = "MYKEY";
    string blobServiceUri = "http://MYACCOUNT.blob.core.windows.net";
    CloudBlobClient blobClient;
    private Report newReport;

    public AddReport()
    {
        InitializeComponent();
    }

    protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
    {
        //base.OnNavigatedTo(e);
        newReport = new Report();
        var credentials = new StorageCredentialsAccountAndKey(storageAccount, storageKey);
        blobClient = new CloudBlobClient(blobServiceUri, credentials);
    }

    private void TakePhotoClick(object sender, EventArgs eventArgs)
    {
        // The camera chooser used to capture a picture.
        CameraCaptureTask ctask;
        // Create a new instance of CameraCaptureTask
        ctask = new CameraCaptureTask();
        // Create a new event handler for capturing a photo
        ctask.Completed += new EventHandler<PhotoResult>(ctask_Completed);
        // Show the camera.
        ctask.Show();
    }

    void ctask_Completed(object sender, PhotoResult e)
    {
        if (e.TaskResult == TaskResult.OK && e.ChosenPhoto != null)
        {
            WriteableBitmap CapturedImage = PictureDecoder.DecodeJpeg(e.ChosenPhoto);
            UploadToBlobContainer(e.ChosenPhoto);
        }
        else
        {
            // user decided not to take a picture
        }
    }

    private void UploadToBlobContainer(System.IO.Stream stream)
    {
        string containerName = "reportsPhotos";
        var container = blobClient.GetContainerReference(containerName);
        container.CreateIfNotExist(true, r =>
            Dispatcher.BeginInvoke(() =>
            {
                var blobName = "report" + newReport.ReportId.ToString();
                var blob = container.GetBlobReference(blobName);
                blob.Metadata["ReportId"] = newReport.ReportId.ToString();
                blob.UploadFromStream(stream, r2 =>
                    Dispatcher.BeginInvoke(() =>
                    {
                        newReport.Photo = container.Uri + "/" + blobName;
                    }));
            }));
    }
}
This is a simple case: I am not using SAS to authenticate; instead I store the key in the app itself (this is only for testing purposes), and my blobs are publicly available.
When I run in debug mode everything seems to be working, but the photo never gets uploaded to the blob. Also, I don't know how to debug this to see whether the blob service returned an error.
Can anyone tell me what might be wrong?
EDIT 1: It seems that the container is not being created either; I've confirmed this using Azure Blob Explorer.
EDIT 2: I am getting a System.Net.WebException: "The remote server returned an error: NotFound."
After long hours I have finally discovered that the problem was with this line:
string containerName = "reportsPhotos";
According to the container naming rules, all letters in a container name must be lowercase.
Changing it to reportsphotos solved the issue.
That was time well spent.
Can you try just doing it like this instead:
// Retrieve storage account from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));

// Create the blob client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");

// Retrieve reference to a blob named "myblob"
CloudBlob blob = container.GetBlobReference("myblob");

// Create or overwrite the "myblob" blob with contents from a local file
using (var fileStream = System.IO.File.OpenRead(@"path\myfile"))
{
    blob.UploadFromStream(fileStream);
}
This is from:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/#upload-blob

Apache Camel server app receiving a multipart form POST (file upload)

I'm using the Camel servlet component in order to receive XML documents, and now I also need to receive files (JPEGs, GIFs, etc.). Here is how my client app sends a file:
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.mime.MultipartEntity;
import org.apache.http.entity.mime.content.FileBody;
import org.apache.http.impl.client.DefaultHttpClient;

import java.io.File;

public class HttpClientUploadHelper {

    public boolean upload(final File file, final String url) {
        boolean wasSent = false;
        HttpClient client = new DefaultHttpClient();
        HttpPost post = new HttpPost(url);
        MultipartEntity entity = new MultipartEntity();
        entity.addPart(file.getName(), new FileBody(file));
        post.setEntity(entity);
        try {
            HttpResponse response = client.execute(post);
            wasSent = response.getStatusLine().getStatusCode() == 200;
        } catch (Exception e) {
        }
        return wasSent;
    }
}
my Camel Processor then extracts the HttpServletRequest this way:
HttpServletRequest req = exchange.getIn().getHeader(Exchange.HTTP_SERVLET_REQUEST, HttpServletRequest.class);
then I have this method to finally parse and save the file:
import org.apache.commons.fileupload.FileItemFactory;
import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.io.IOUtils;

... class declaration, body, etc...

void parseAndSaveFile(final HttpServletRequest req) throws Exception {
    // Check that we have a file upload request
    boolean isMultipart = ServletFileUpload.isMultipartContent(req);
    // Create a factory for disk-based file items
    FileItemFactory factory = new DiskFileItemFactory();
    // Create a new file upload handler
    ServletFileUpload upload = new ServletFileUpload(factory);
    // Parse the request
    FileItemIterator receivedFiles = upload.getItemIterator(req);
    while (receivedFiles.hasNext()) {
        FileItemStream file = receivedFiles.next();
        if (file.isFormField()) {
            System.out.println("WTF?");
        } else {
            String fileName = file.getName();
            File uploadedFile = new File("/home/myuser/" + fileName);
            FileOutputStream out = new FileOutputStream(uploadedFile);
            IOUtils.copy(file.openStream(), out);
        }
    }
}
When I use the above code within Camel, the isMultipart flag is "true" but the receivedFiles iterator doesn't contain any elements. When I use the same code in another project with just a plain servlet, it works. In both cases I'm using Jetty as the web container.
So is there any other way to extract the file name and its content within my Camel processor?
Thanks!
Since you're using Jetty, have you considered using the included MultipartFilter instead of the FileUpload project? Super clean and easy to use.
From the javadoc:
"This class decodes the multipart/form-data stream sent by a HTML form that uses a file input item. Any files sent are stored to a temporary file and a File object added to the request as an attribute. All other values are made available via the normal getParameter API and the setCharacterEncoding mechanism is respected when converting bytes to Strings."
Does this help?
public void process(Exchange exchange) throws Exception {
    Message in = exchange.getIn();
    Set<String> names = in.getAttachmentNames();
    for (String n : names) {
        System.out.println("attachment " + n);
        DataHandler h = in.getAttachment(n);
        if (h != null) {
            try {
                Object o = h.getContent();
                System.out.println(o);
            } catch (Exception e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }
    if (!names.isEmpty())
        return;
}
