Copy related file with Apache Camel - apache-camel

First of all: I'm a Camel newbie :-)
I want to watch a directory for XML files. When one arrives, I want to move that XML file to another directory, move the PDF file with the same filename (but a different extension) to the same directory, and then do some Java stuff.
What is the best way to move that PDF file?
This is the route that I currently have:
from("file://C:/temp/camel/in?delete=true").filter(new Predicate() {
#Override
public boolean matches(final Exchange exchange) {
String filename = (String) exchange.getIn().getHeader("CamelFileRelativePath");
return "xml".equals(FilenameUtils.getExtension(filename));
}
})
.to("file://C:/temp/camel/out").bean(ServiceBean.class, "callWebservice")
Thanks!

You can achieve that without a filter, by using a regular expression to include only the two extensions, .xml and .pdf:
from("file://C:/temp/camel/in?delete=true&include=.*.xml|.*.pdf")
    .to("file://C:/temp/camel/out")
    .bean(ServiceBean.class, "callWebservice");
If you use the filter, the files you are not interested in are still consumed and deleted (because of delete=true), which might not be what you want; this solution simply leaves them in the input directory.
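If the webservice call should only run for the XML file (and not for the matching PDF), you can branch after the copy. A minimal sketch, assuming the Simple/File Language token ${file:ext} and reusing the ServiceBean from the question:
from("file://C:/temp/camel/in?delete=true&include=.*.xml|.*.pdf")
    .to("file://C:/temp/camel/out")
    // the PDF is only copied; the webservice bean is invoked for the XML file alone
    .filter(simple("${file:ext} == 'xml'"))
        .bean(ServiceBean.class, "callWebservice");
Because both files match the include pattern, the XML and its sibling PDF both end up in C:/temp/camel/out, which is what the question asked for.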

Related

Camel for multiple files processing

I am new to Camel. I need to do some file processing with Camel, but I haven't found a ready-made solution for my case. I have to process multiple files together when they both exist. These files are uploaded to a specific folder with some delay (example: we have two files, A.csv and B.csv, and A.csv is uploaded 10 seconds later than B.csv, or vice versa). Also, if one file is absent for more than a specific time, I need to process only the one file that is there. Could anybody help me with choosing a pattern? As I understand it, I can use the Camel filter to make sure we already have both A.csv and B.csv and only then start processing, but that doesn't solve my problem.
This is the Aggregator EIP.
from("file:inputFolder")
    .aggregate(constant(true), AggregationStrategies.groupedExchange())
        .completionSize(2) // Wait for two files
        .completionTimeout(60000) // Or process a single file if completionSize was not fulfilled within one minute
    .to("log:do_something"); // Here you can access List<Exchange> from the message body
To group messages you can use a correlation Expression. For your example (grouping messages by the filename prefix before _) it could be something like this:
private final Expression CORRELATION_EXPRESSION = new Expression() {
    @Override
    public <T> T evaluate(Exchange exchange, Class<T> type) {
        final String fileName = exchange.getIn().getHeader(Exchange.FILE_NAME, String.class);
        final String correlationExpression = fileName.substring(0, fileName.indexOf('_'));
        return exchange.getContext().getTypeConverter().convertTo(type, correlationExpression);
    }
};
And pass it to the Aggregator:
from("file:inputDirectory")
.aggregate(CORRELATION_EXPRESSION, AggregationStrategies.groupedExchange())
...
See this gist for a full example: https://gist.github.com/bedlaj/a2a56aa9291bced8c0a8edebacaf22b0
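If you want to handle the grouped files in Java rather than just logging them, here is a short sketch of a processor reading the List<Exchange> that groupedExchange() puts into the message body (the actual per-file processing is left as a placeholder):
from("file:inputFolder")
    .aggregate(constant(true), AggregationStrategies.groupedExchange())
        .completionSize(2)
        .completionTimeout(60000)
    .process(exchange -> {
        // the aggregated exchanges arrive as a List<Exchange> in the body
        List<Exchange> group = exchange.getIn().getBody(List.class);
        for (Exchange single : group) {
            String name = single.getIn().getHeader(Exchange.FILE_NAME, String.class);
            // placeholder: process each file here, e.g. single.getIn().getBody(String.class)
        }
    });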

Dynamic Apache Camel Output Route

Hi, I want to compute a dynamic output route using Apache Camel. I receive a bunch of files in a folder location, and based on each file's contents I want to move it to a dynamic output folder. The name of the output folder will be constructed from the content of the input file. How do I achieve this?
The following piece of code reads the files and processes them, but I am not sure how to set the value of ${foldername} based on the contents of the file:
from("file:D:\\camel\\input\\one?recursive=true&delete=true")
.process(new LogProcessor())
.to("file:D:\\camel\\output\\${foldername}")
Please assist
You could create a custom processor to construct the folder name and insert it into a header.
public class DirectoryNameProcessor implements Processor {
    @Override
    public void process(Exchange exchange) {
        Message in = exchange.getIn();
        // Get the contents of the processed file
        String body = in.getBody(String.class);
        // Get the original file name
        String fileName = in.getHeader("CamelFileName", String.class);
        // Perform your logic to derive the folder name, then store it in a header
        String foldername = ...; // e.g. derived from body and/or fileName
        in.setHeader("foldername", foldername);
    }
}
Then in your route you could access the newly created foldername header:
.to("file:D:\\camel\\output\\${header.foldername}");
The short answer is, you can use the dynamic to endpoint toD.
http://camel.apache.org/message-endpoint.html#MessageEndpoint-DynamicTo
It would look like this (assuming the processor stores the computed folder name in a foldername header):
from("file:D:\\camel\\input\\one?recursive=true&delete=true")
    .process(new LogProcessor())
    .toD("file:D:\\camel\\output\\${header.foldername}")

How can Apache Camel be used to monitor file changes?

I would like to monitor all of the files in a given directory for changes, i.e. an updated timestamp. This use case seems natural for Camel using the file component, but I can't seem to find a way to configure this behavior.
A uri like:
file:/some/directory
will consume the files in the provided directory but will delete them.
A uri like:
file:/some/directory?noop=true
consumes each file once when it is added or when the route is started.
It's surprising that there isn't an option along the lines of
consumeOnChange=true
Is there a straightforward way to monitor file changes and not delete the file after consuming?
You can do this by setting up the idempotentKey to tell Camel how a file is considered changed, for example if the file size changes or its timestamp changes.
See more details at the Camel file documentation at: https://camel.apache.org/components/latest/file-component.html
See the section Avoiding reading the same file more than once (idempotent consumer). And read about idempotent and idempotentKey.
So something like:
from("file:/somedir?noop=true&idempotentKey=${file:name}-${file:size}")
Or
from("file:/somedir?noop=true&idempotentKey=${file:name}-${file:modified}")
You can read here about the various ${file:xxx} tokens you can use: http://camel.apache.org/file-language.html
Setting noop to true will result in Camel setting idempotent=true as well, despite the fact that idempotent is false by default.
The simplest solution to monitor files would be:
from("file:path?noop=true&idempotent=false&delay=60s")
This will check all files in the given directory for changes every minute.
This can be found in the Camel documentation at: http://camel.apache.org/file2.html.
I don't think Camel supports that specific feature, but with the existing options you can come up with a similar solution for monitoring a directory.
What you need to do is set a small delay value for checking the directory and maintain a repository of the already-read files. Depending on how you configure the repository (by size, by filename, by a mix of them...), this solution would be able to provide you with information about new files and modified files. As a caveat, it would be consuming the files in the directory very often.
Maybe you could use solutions other than Camel, like Apache Commons VFS2 (I wrote an explanation of how to use it for this scenario: WatchService locks some files?).
I faced the same problem, i.e. I wanted to copy updated files as well (along with new files). Below is my configuration:
public static void main(String[] a) throws Exception {
    CamelContext cc = new DefaultCamelContext();
    cc.addRoutes(createRouteBuilder());
    cc.start();
    Thread.sleep(10 * 60 * 1000);
    cc.stop();
}

protected static RouteBuilder createRouteBuilder() {
    return new RouteBuilder() {
        public void configure() {
            from("file://D:/Production"
                    + "?idempotent=true"
                    + "&idempotentKey=${file:name}-${file:size}"
                    + "&include=.*.log"
                    + "&noop=true"
                    + "&readLock=changed")
                .to("file://D:/LogRepository");
        }
    };
}
My testing steps:
Run the program; it copies a few .log files from D:/Production to D:/LogRepository and then continues to poll the D:/Production directory.
I opened an already copied log, say A.log, from D:/Production (since noop=true nothing is moved), edited it with some editor tool so that the file size doubled, and saved it.
At this point I think Camel is supposed to copy that particular file again, since its size has changed and in my route definition I used "idempotent=true&idempotentKey=${file:name}-${file:size}&readLock=changed". But Camel ignores the file.
When I use TRACE logging it says "Skipping as file is already in progress...", but I did not find any lock file in the D:/Production directory when I edited and saved the file.
I also checked that Camel still ignores the file if I replace A.log (same name but bigger size) in the D:/Production directory from outside.
But I found that everything works as expected if I remove the noop=true option.
Am I missing something?
If you want to monitor file changes in Camel, use the file-watch component.
Example: recursive watch of all events (file creation, file deletion, file modification):
from("file-watch://some-directory")
.log("File event: ${header.CamelFileEventType} occurred on file ${header.CamelFileName} at ${header.CamelFileLastModified}");
You can see the complete documentation here:
Camel file-watch component
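If you only care about modifications (the updated-timestamp case from the question), you can narrow the watch down. A sketch, assuming the events and recursive query options of camel-file-watch:
from("file-watch://some-directory?events=MODIFY&recursive=true")
    .log("File ${header.CamelFileName} was modified at ${header.CamelFileLastModified}");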

Trying to access web/uploads/produits/img Symfony2

I'm trying to upload a file and place it in my web/uploads/produits/img directory, but the code below is not working:
public function getUploadDir()
{
    return 'uploads/produits/img';
}

protected function getUploadRootDir()
{
    return __DIR__.'/../../../../web/'.$this->getUploadDir();
}
I get the following error:
Could not move the file "C:\wamp\tmp\php9265.tmp" to "C:\wamp\www\Projet\src\Arkiglass\ProduitBundle/../../../..\web/uploads/produits/img\." (move_uploaded_file()
[function.move-uploaded-file]: Unable to move 'C:\wamp\tmp\php9265.tmp' to 'C:\wamp\www\Projet\src\Arkiglass\ProduitBundle/../../../..\web/uploads/produits/img\.')
It seems like it doesn't know the directory __DIR__.'/../../../../web/'...
It doesn't seem like your entity lives under the Entity directory, but rather directly under the bundle directory. Does it? So you're going up one directory too many. Two options:
Move your entity into the Entity subfolder and adjust the namespace and all references. That is the standard Symfony bundle directory layout.
Remove one level of ../

Create a directory dynamically inside the "web pages" folder of a java web application

So I'm trying to dynamically create a folder inside the web pages folder.
I'm making a game database. Every time a game is added I do this:
public void addGame(Game game) throws DatabaseException {
    em.getTransaction().begin();
    em.persist(game);
    em.getTransaction().commit();

    File file = new File("C:\\GameDatabaseTestFolder");
    file.mkdir();
}
So everything works here.
The file gets created.
But I want to create the folder like this:
public void addGame(Game game) throws DatabaseException {
    em.getTransaction().begin();
    em.persist(game);
    em.getTransaction().commit();

    File file = new File(game.getId() + "/screenshots");
    file.mkdir();
}
Or something like that, so it will be created where my JSP files are and it will have the id of the game.
I don't understand where the folder is created by default.
Thank you in advance,
David
It's by default relative to the "current working directory", i.e. the directory which is currently open at the moment the Java Runtime Environment has started the server. That may be for example /path/to/tomcat/bin, or /path/to/eclipse/workspace/project, etc, depending on how the server is started.
You should now realize that this condition is not controllable from inside the web application.
You also don't want to store it in the expanded WAR folder (there where your JSPs are), because any changes will get lost whenever you redeploy the WAR (with the very simple reason that those files are not contained in the original WAR).
Rather use an absolute path instead. E.g.
String gameWorkFolder = "/path/to/game/work/folder";
new File(gameWorkFolder, game.getId() + "/screenshots").mkdirs();
You can make it configurable by supplying it as a properties file setting or a VM argument.
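For example, a small sketch that reads the folder from a VM argument (-Dgame.work.folder=..., the property name here is made up) and creates the nested directories:
// falls back to a default path when the VM argument is not supplied
String gameWorkFolder = System.getProperty("game.work.folder", "/path/to/game/work/folder");
File screenshots = new File(gameWorkFolder, game.getId() + "/screenshots");
if (!screenshots.mkdirs() && !screenshots.isDirectory()) {
    throw new IllegalStateException("Could not create " + screenshots);
}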
See also:
Image Upload and Display in JSP
getResourceAsStream() vs FileInputStream
