What does 'moveFailed' really do? - apache-camel

I want to create a file input that behaves as follows:
1. Process the exchange.
2. Attempt to copy the input file to a shared drive.
3. If step (2) fails (e.g. the share is down), move the file to a local directory instead.
According to the docs, the 'moveFailed' parameter allows you to "set a different target directory when moving files after processing (configured via move defined above) failed". So it sounds like moveFailed would cover step (3).
The following test, however, fails. What am I doing wrong? I am using Camel 2.10.0.fuse.
package sandbox.camel;

import java.io.File;

import org.apache.camel.Endpoint;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.junit.Test;

public class MoveFailedTest extends org.apache.camel.test.junit4.CamelTestSupport {

    private String failedDir = "move-failed";

    @Override
    protected RouteBuilder createRouteBuilder() throws Exception {
        return new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("file:tmp/prepare").to("file:tmp/input");
                from("file:tmp/input?move=/doesnotexist&moveFailed=" + failedDir).to("file:tmp/output");
            }
        };
    }

    @Test
    public void test_move() throws Exception {
        // arrange
        File moveFailedDir = new File("tmp/input/" + failedDir);
        moveFailedDir.mkdirs();
        File[] failedCount1 = moveFailedDir.listFiles();
        failedCount1 = failedCount1 == null ? new File[0] : failedCount1;
        String messagePayload = "Hello";
        Endpoint input = getMandatoryEndpoint("file:tmp/prepare");
        MockEndpoint output = getMockEndpoint("mock:file:tmp/output");
        output.setMinimumExpectedMessageCount(1);
        output.expectedBodiesReceived(messagePayload);

        // act
        template.asyncSendBody(input, messagePayload);
        Thread.sleep(3000);

        // assert: only 1 output
        assertMockEndpointsSatisfied();

        // assert: rename failed, hence the input file was moved to the 'move-failed' directory
        File[] failedCount2 = moveFailedDir.listFiles();
        assertEquals("No file appeared in 'move-failed' directory", failedCount1.length + 1, failedCount2.length);
    }
}

Your test is most likely wrong. The autoCreate option defaults to true, which means directories are created if needed, so the move to /doesnotexist simply succeeds and moveFailed is never triggered.
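For illustration, a hedged sketch (not tested; it simply follows from the explanation above): if you disable autoCreate on the consumer endpoint, the move target cannot be created, the rename should actually fail, and moveFailed should kick in:

    from("file:tmp/input?autoCreate=false&move=/doesnotexist&moveFailed=" + failedDir)
        .to("file:tmp/output");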

Related

Integration testing flink job

I've written a small Flink application. It takes some input and enriches it with data from an external source. It's a RichAsyncFunction, and within the open method I construct an HTTP client to be used for the enrichment.
Now I want to write an integration test for my job. But since the HTTP client is created within the open method, I have no way to provide and mock it in my integration test. I've tried to refactor it so the client is provided via the constructor, but then I always get serialization errors.
This is the example I'm working from:
https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/stream/operators/asyncio.html
Thanks in advance :)
This question was posted over a year ago, but I'll post the answer in case anyone stumbles upon this in the future.
The serialization exception you are seeing is likely this:
Exception encountered when invoking run on a nested suite. *** ABORTED *** (610 milliseconds)
java.lang.NullPointerException:
at java.util.Objects.requireNonNull(Objects.java:203)
at org.apache.flink.streaming.runtime.streamrecord.StreamElementSerializer.<init>(StreamElementSerializer.java:64)
at org.apache.flink.streaming.api.operators.async.AsyncWaitOperator.setup(AsyncWaitOperator.java:136)
at org.apache.flink.streaming.api.operators.SimpleOperatorFactory.createStreamOperator(SimpleOperatorFactory.java:77)
at org.apache.flink.streaming.api.operators.StreamOperatorFactoryUtil.createOperator(StreamOperatorFactoryUtil.java:70)
at org.apache.flink.streaming.util.AbstractStreamOperatorTestHarness.setup(AbstractStreamOperatorTestHarness.java:366)
at org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness.setup(OneInputStreamOperatorTestHarness.java:165)
...
The reason is that your test operator needs to know how to deserialize the DataStream input type. The only way to provide this is by supplying it directly while initializing the testHarness and then passing it to the setup() method call.
So, to test the example from the Flink docs you linked, you can do something like this (my implementation is in Scala, but you can adapt it to Java as well):
import org.apache.flink.api.common.ExecutionConfig
import org.apache.flink.api.java.typeutils.TypeExtractor
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.datastream.AsyncDataStream.OutputMode
import org.apache.flink.streaming.api.operators.async.AsyncWaitOperator
import org.apache.flink.streaming.runtime.tasks.{StreamTaskActionExecutor, TestProcessingTimeService}
import org.apache.flink.streaming.runtime.tasks.mailbox.{MailboxExecutorImpl, TaskMailboxImpl}
import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness
import org.scalatest.{BeforeAndAfter, FunSuite, Matchers}

/**
 * This test case is written using Flink 1.11+.
 * Older versions likely have a simpler constructor definition for [[AsyncWaitOperator]],
 * so you might have to remove the last two arguments (processingTimeService and mailboxExecutor).
 */
class AsyncDatabaseRequestSuite extends FunSuite with BeforeAndAfter with Matchers {
  var testHarness: OneInputStreamOperatorTestHarness[String, (String, String)] = _
  val TIMEOUT = 1000
  val CAPACITY = 1000
  val MAILBOX_PRIORITY = 0

  def createTestHarness: Unit = {
    val operator = new AsyncWaitOperator[String, (String, String)](
      new AsyncDatabaseRequest {
        override def open(configuration: Configuration): Unit = {
          client = new MockDatabaseClient(host, port, credentials) // put your mock DatabaseClient object here
        }
      },
      TIMEOUT,
      CAPACITY,
      OutputMode.UNORDERED,
      new TestProcessingTimeService,
      new MailboxExecutorImpl(
        new TaskMailboxImpl,
        MAILBOX_PRIORITY,
        StreamTaskActionExecutor.IMMEDIATE
      )
    )

    // supply the TypeSerializer for the "input" type of the operator
    testHarness = new OneInputStreamOperatorTestHarness[String, (String, String)](
      operator,
      TypeExtractor.getForClass(classOf[String]).createSerializer(new ExecutionConfig)
    )

    // supply the TypeSerializer for the "output" type of the operator to the setup() call
    testHarness.setup(
      TypeExtractor.getForClass(classOf[(String, String)]).createSerializer(new ExecutionConfig)
    )
    testHarness.open()
  }

  before {
    createTestHarness
  }

  after {
    testHarness.close()
  }

  test("Your test case goes here") {
    // fill in your test case here
  }
}
Here is the solution in Java:

class TestingClass {

    @InjectMocks
    ClassUnderTest cut;

    private static OneInputStreamOperatorTestHarness<IN, OUT> testHarness; // replace IN, OUT with your asyncFunction's input/output types
    private static long TIMEOUT = 1000;
    private static int CAPACITY = 1000;
    private static int MAILBOX_PRIORITY = 0;
    private long UNUSED_TIME = 0L;
    Driver driverRef;

    public void createTestHarness() throws Exception {
        cut = new ClassUnderTest() {
            @Override
            public void open(Configuration parameters) throws Exception {
                driver = mock(Driver.class); // mock your driver (external data source) here
                driverRef = driver;          // keep an external reference to the driver to use in the test
            }
        };
        MailboxExecutorImpl mailboxExecutorImpl = new MailboxExecutorImpl(
                new TaskMailboxImpl(), MAILBOX_PRIORITY, StreamTaskActionExecutor.IMMEDIATE
        );
        AsyncWaitOperator operator = new AsyncWaitOperator<>(
                cut,
                TIMEOUT,
                CAPACITY,
                ORDERED,
                new TestProcessingTimeService(),
                mailboxExecutorImpl
        );
        testHarness = new OneInputStreamOperatorTestHarness<IN, OUT>(
                operator,
                TypeExtractor.getForClass(IN.class).createSerializer(new ExecutionConfig())
        );
        testHarness.setup(TypeExtractor.getForClass(OUT.class).createSerializer(new ExecutionConfig()));
        testHarness.open();
    }

    @BeforeEach
    void setUp() throws Exception {
        createTestHarness();
        MockitoAnnotations.openMocks(this);
    }

    @AfterEach
    void tearDown() throws Exception {
        testHarness.close();
    }

    @Test
    public void test_yourTestCase() throws Exception {
    }
}

Hadoop Map Reduce - Read HDFS File - FileAlreadyExists error

I am new to Hadoop. I am trying to read an existing file on HDFS using the code below. The configuration seems fine and the file path is correct as well.
public static class Map extends Mapper<LongWritable, Text, Text, Text> {

    private static Text f1, f2, hdfsfilepath;
    private static HashMap<String, ArrayList<String>> friendsData = new HashMap<>();

    public void setup(Context context) throws IOException {
        Configuration conf = context.getConfiguration();
        Path path = new Path("hdfs://cshadoop1" + conf.get("hdfsfilepath"));
        FileSystem fs = FileSystem.get(path.toUri(), conf);
        if (fs.exists(path)) {
            BufferedReader br = new BufferedReader(
                    new InputStreamReader(fs.open(path)));
            String line;
            line = br.readLine();
            while (line != null) {
                StringTokenizer str = new StringTokenizer(line, ",");
                String friend = str.nextToken();
                ArrayList<String> friendDetails = new ArrayList<>();
                while (str.hasMoreTokens()) {
                    friendDetails.add(str.nextToken());
                }
                friendsData.put(friend, friendDetails);
                line = br.readLine(); // advance to the next line, otherwise this loop never ends
            }
        }
    }

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String k : friendsData.keySet()) {
            context.write(new Text(k), new Text(friendsData.get(k).toString()));
        }
    }
}
I am getting the exception below when I run the code:
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://cshadoop1/socNetData/userdata/userdata.txt already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
I am just trying to read an existing file. Any ideas what I am missing here? Appreciate any help.
The exception tells you that your output directory already exists, but it should not. Delete it or change its name.
Moreover, the name of your output directory, 'userdata.txt', looks like a file name, so check that you have not mixed up your input and output directories.
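If you would rather clean up programmatically, a common pattern is to delete a stale output directory in the driver before submitting the job. A minimal sketch (the class name and output path below are made up for illustration; the FileSystem and FileOutputFormat calls are standard Hadoop API):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class FriendsJobDriver { // hypothetical driver class
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "friends-job");
        // The output must be a directory the job will create, not an existing file.
        Path outputPath = new Path("hdfs://cshadoop1/socNetData/output"); // illustrative path
        FileSystem fs = FileSystem.get(outputPath.toUri(), conf);
        if (fs.exists(outputPath)) {
            fs.delete(outputPath, true); // recursively remove the stale output directory
        }
        FileOutputFormat.setOutputPath(job, outputPath);
        // ... set mapper, reducer, input path, then job.waitForCompletion(true)
    }
}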

Error in Native Interface in codenameone

I have added the PayPal Android SDK under the native/android package and created the native interface in the main project structure (com.mycompany.myapp). Under native/android, the implemented class uses the PayPal SDK classes.
My implemented class:
package com.mycompany.myapp;
import com.paypal.android.sdk.payments.PayPalConfiguration;
import com.paypal.android.sdk.payments.PayPalPayment;
import com.paypal.android.sdk.payments.PaymentActivity;
import android.content.Intent;
import android.net.Uri;
import android.app.Activity;
import com.codename1.impl.android.AndroidNativeUtil;
import com.codename1.impl.android.CodenameOneActivity;
import java.math.BigDecimal;
public class MyNativeImpl {

    // private static final String TAG = "paymentdemoblog";

    /**
     * - Set to PaymentActivity.ENVIRONMENT_PRODUCTION to move real money.
     *
     * - Set to PaymentActivity.ENVIRONMENT_SANDBOX to use your test credentials
     *   from https://developer.paypal.com
     *
     * - Set to PayPalConfiguration.ENVIRONMENT_NO_NETWORK to kick the tires
     *   without communicating to PayPal's servers.
     */
    // private static final String CONFIG_ENVIRONMENT =
    // PayPalConfiguration.ENVIRONMENT_NO_NETWORK;
    private static final String CONFIG_ENVIRONMENT = PayPalConfiguration.ENVIRONMENT_SANDBOX;

    // note that these credentials will differ between live & sandbox environments
    private static final String CONFIG_CLIENT_ID = "Aeqc2X1rBIEUtDNqsaRNr0h1neFo9QnNmfgmpA3D32uSLaHpGJu9NV1KfMnFmy7O-_hV47I7ST0SXDW2";

    private static final int REQUEST_CODE_PAYMENT = 1;
    private static final int REQUEST_CODE_FUTURE_PAYMENT = 2;

    private static PayPalConfiguration config = new PayPalConfiguration()
            .environment(CONFIG_ENVIRONMENT)
            .clientId(CONFIG_CLIENT_ID)
            // The following are only used in PayPalFuturePaymentActivity.
            .merchantName("Hipster Store")
            .merchantPrivacyPolicyUri(
                    Uri.parse("https://www.example.com/privacy"))
            .merchantUserAgreementUri(
                    Uri.parse("https://www.example.com/legal"));

    PayPalPayment thingToBuy;

    private static Activity activity() {
        return com.codename1.impl.android.AndroidNativeUtil.getActivity();
    }

    public String payPalTest() {
        // Activity activity = AndroidNativeUtil.getActivity();
        thingToBuy = new PayPalPayment(new BigDecimal("10"), "USD",
                "HeadSet", PayPalPayment.PAYMENT_INTENT_SALE);
        Intent intent = new Intent(activity(), PaymentActivity.class);
        intent.putExtra(PaymentActivity.EXTRA_PAYMENT, thingToBuy);
        activity().startActivityForResult(intent, REQUEST_CODE_PAYMENT);
        return "test";
    }

    public boolean isSupported() {
        return false;
    }
}
I called the method from the main class:

MyNative my = (MyNative) NativeLookup.create(MyNative.class);
if (my != null) {
    String aa = my.payPalTest();
    System.out.println("result::" + aa);
    System.out.println("paypalInt" + my.toString());
}
The APK builds successfully, but I get the error below when the code is triggered:
android.content.ActivityNotFoundException: Unable to find explicit activity class {com.mycompany.myapp/com.paypal.android.sdk.paymentActivity....
It is searching for the PayPal SDK classes under the main project folder structure. Do I need to add the SDK jar under that structure?
What do I need to do to fix the issue?
The code looks fine; I am guessing this is something in the configuration. See: Unable to find explicit activity class Payment activity with PayPal SDK in Xamarin on Android

Tomcat executor with runnable while(true) loop is only run once. Why?

I am trying to implement a javax.mail.event.MessageCountListener in Tomcat. When I start the application, the contextInitialized method seems to run and the mailbox is read. However, I see the log message "Idling" only once. I would expect it to idle constantly and invoke the AnalyzerService when an email is received or deleted.
Update: I found that the idle() method is not returning. It runs until the com.sun.mail.iap.ResponseInputStream.readResponse(ByteArray ba) method, where it enters a while loop that it never gets out of.
Am I misusing the idle() method for something it is not meant to do? Is this a bug in the com.sun.mail.iap package?
The AnalyzerContextListener.java:
import com.sun.mail.imap.IMAPStore;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.mail.Folder;
import javax.mail.MessagingException;
import javax.mail.Session;
import javax.mail.event.MessageCountListener;
import javax.servlet.ServletContext;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class AnalyzerContextListener implements ServletContextListener {

    private ExecutorService executorService;
    private final String username = "myemail@gmail.com";
    private final String password = "mypassword";
    private final String mailhost = "imap.gmail.com";
    private final String foldername = "INBOX";

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        final ServletContext servletContext = sce.getServletContext();
        executorService = Executors.newFixedThreadPool(3);
        Session session = Session.getInstance(new Properties());
        try {
            final IMAPStore store = (IMAPStore) session.getStore("imaps");
            store.connect(mailhost, username, password);
            final Folder folder = store.getFolder(foldername);
            if (folder == null) {
                servletContext.log("Folder in mailbox bestaat niet.");
                return;
            }
            folder.open(Folder.READ_ONLY);
            MessageCountListener countListener = new AnalyzerService();
            folder.addMessageCountListener(countListener);
            Runnable runnable = new Runnable() {
                @Override
                public void run() {
                    while (true) {
                        try {
                            servletContext.log("Aantal berichten in folder: " + folder.getMessageCount());
                            servletContext.log("Idling");
                            store.idle();
                        } catch (MessagingException ex) {
                            servletContext.log(ex.getMessage());
                            return;
                        }
                    }
                }
            };
            executorService.execute(runnable);
            servletContext.log("Executorservice gestart");
        } catch (MessagingException ex) {
            servletContext.log(ex.getMessage());
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        sce.getServletContext().log("Context wordt vernietigd");
        executorService.shutdown();
        sce.getServletContext().log("Executorservice gestopt");
    }
}
The AnalyzerService.java:
import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.event.MessageCountEvent;
import javax.mail.event.MessageCountListener;

class AnalyzerService implements MessageCountListener {

    public AnalyzerService() {
    }

    @Override
    public void messagesAdded(MessageCountEvent event) {
        Message[] addedMessages = event.getMessages();
        for (Message message : addedMessages) {
            try {
                System.out.println(message.getSubject());
            } catch (MessagingException ex) {
                System.out.println(ex.getMessage());
            }
        }
    }

    @Override
    public void messagesRemoved(MessageCountEvent event) {
        Message[] removedMessages = event.getMessages();
        for (Message message : removedMessages) {
            try {
                System.out.println(message.getSubject());
            } catch (MessagingException ex) {
                System.out.println(ex.getMessage());
            }
        }
    }
}
while (true) {
    try {
        servletContext.log("Aantal berichten in folder: " + folder.getMessageCount());
        servletContext.log("Idling");
        store.idle();
    } catch (MessagingException ex) {
        servletContext.log(ex.getMessage());
        return;
    }
}
has exactly three possibilities to end or to run only once.
The loop actually ends either:
1. Through the explicit return in case of a MessagingException. Look at your logs; there is either a message or something strange like "null". Consider using a proper stacktrace log (.log(String message, Throwable throwable)), since Exception#getMessage() is often empty or not telling you much (see the small sketch after this list).
2. Through any unchecked exception. You should notice that in some log though, since uncaught exceptions via executorService.execute should invoke the nearest uncaught exception handler, which is generally bad. See Choose between ExecutorService's submit and ExecutorService's execute.
Since you observe that the loop stops executing right after it logs "Idling", the remaining possibility is:
3. store.idle() never returns. (Every other line of code could theoretically do that as well, e.g. the folder.getMessageCount() call in a second iteration, but that is very unlikely.)
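For example, the catch block could look like this (a small sketch using the two-argument ServletContext.log overload so the full stack trace is preserved):

try {
    store.idle();
} catch (MessagingException ex) {
    // ex.getMessage() alone is often null or empty; log a message and the throwable itself
    servletContext.log("IMAP idle loop terminated", ex);
    return;
}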
Regarding possibility No. 3, the documentation says:
Use the IMAP IDLE command (see RFC 2177), if supported by the server, to enter idle mode so that the server can send unsolicited notifications without the need for the client to constantly poll the server. Use a ConnectionListener to be notified of events. When another thread (e.g., the listener thread) needs to issue an IMAP command for this Store, the idle mode will be terminated and this method will return. Typically the caller will invoke this method in a loop.
If the mail.imap.enableimapevents property is set, notifications received while the IDLE command is active will be delivered to ConnectionListeners as events with a type of IMAPStore.RESPONSE. The event's message will be the raw IMAP response string. Note that most IMAP servers will not deliver any events when using the IDLE command on a connection with no mailbox selected (i.e., this method). In most cases you'll want to use the idle method on IMAPFolder.
That sounds like this method is not designed to return any time soon; in your case never, since you don't issue any commands towards the server after you enter idle. Besides that:
folder.idle() could be what you should actually use.
I guess the documentation is wrong there; note, however, that ConnectionListener and MessageCountListener are two different things.
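A rough sketch of how that could look inside the Runnable from the question (assuming folder can be cast to com.sun.mail.imap.IMAPFolder, which it is for the imaps store; the MessageCountListener stays registered on the folder exactly as before):

final IMAPFolder imapFolder = (IMAPFolder) folder; // import com.sun.mail.imap.IMAPFolder
while (folder.isOpen()) {
    try {
        // blocks until the server pushes a notification, then returns
        imapFolder.idle();
        // messagesAdded()/messagesRemoved() on your listener have been invoked by now
    } catch (MessagingException ex) {
        servletContext.log("idle() failed", ex);
        return;
    }
}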

Cannot Find Symbol for another class file

I've had this problem a few times, where I've created another class file and the main class file can't find it.
Here's the main class file:
package textfiles;

import java.io.IOException;

public class FileData
{
    public static void main(String[] args)
    {
        String file_name = "Lines.txt";

        try {
            ReadFile file = new ReadFile(file_name);
            String[] aryLines = file.OpenFile();

            for (int i = 0; i < aryLines.length; i++)
            {
                System.out.println(aryLines[i]); // print each line, not the array reference
            }
        }
        catch (IOException e)
        {
            System.out.println(e.getMessage());
        }
    }
}
Here is the class file it can't find:
package textfiles;

import java.io.IOException;
import java.io.FileReader;
import java.io.BufferedReader;

public class ReadFile
{
    private String path;
    int numberOfLines = 0;

    public ReadFile(String file_path)
    {
        path = file_path;
    }

    public String[] OpenFile() throws IOException
    {
        FileReader fr = new FileReader(path);
        BufferedReader br = new BufferedReader(fr);

        int numberOfLines = readLines();
        String[] textData = new String[numberOfLines];

        for (int i = 0; i < numberOfLines; i++)
        {
            textData[i] = br.readLine();
        }

        br.close();
        return textData;
    }

    int readLines() throws IOException
    {
        FileReader file_to_read = new FileReader(path);
        BufferedReader bf = new BufferedReader(file_to_read);

        String aLine;
        while ((aLine = bf.readLine()) != null)
        {
            numberOfLines++;
        }

        bf.close();
        return numberOfLines;
    }
}
I've tried running javac textfiles\ReadFile.java and javac textfiles\FileData.java, as was suggested for this. That doesn't work. I've made sure I have compiled ReadFile and fixed all the errors there.
The compiler error I get is:
C:\Users\Liloka\Source>javac FileData.java
FileData.java:13: cannot find symbol
symbol : class ReadFile
location: class textfiles.FileData
ReadFile file = new ReadFile(file_name);
^
FileData.java:13: cannot find symbol
symbol : class ReadFile
location: class textfiles.FileData
ReadFile file = new ReadFile(file_name);
^
2 errors
I'm using Notepad++ and cmd, so it can't be an IDE error.
Thanks in advance!
Make sure the java files are all in the textfiles directory:
textfiles/FileData.java
textfiles/ReadFile.java
And run:
javac textfiles/FileData.java textfiles/ReadFile.java
java textfiles.FileData
Your code works without any modification. I think you are compiling from the wrong directory:
C:\Users\Liloka\Source>javac FileData.java
Move FileData.java to the textfiles directory.
You have to compile all the Java files used by your main class. Since ReadFile is used by FileData, you have to compile it too. Did you try
javac FileData.java ReadFile.java
or
javac *.java
?
There must be a conflict with the generated classes. Just try to remove all the classes that have been generated and build the project again.
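For example, from the source directory shown in your compiler output (paths taken from the question; adjust as needed):

cd C:\Users\Liloka\Source
del textfiles\*.class
javac textfiles\*.java
java textfiles.FileData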
