Increasing heap by excessive use of Java ScriptEngine (Jython) - heap-memory

We have a Java EE application that uses Jython to execute some Python scripts. Over time the used heap space gets bigger and bigger until there is no heap space left. In a heap dump I can see that there are a lot of Py* classes.
So I wrote a small test program:
TestApp
public class TestApp {
private final ScriptEngineManager scriptEngineManager = new ScriptEngineManager();
private HashMap<String, ScriptEngine> scriptEngines = new HashMap<String, ScriptEngine>();
private final String scriptContainerPath = "";
public static void main(String[] args) throws InterruptedException {
int counter = 1;
while(true) {
System.out.println("iteration: " + counter);
TestApp testApp = new TestApp();
testApp.execute();
counter++;
Thread.sleep(100);
}
}
void execute() {
File scriptContainer = new File(scriptContainerPath);
File[] scripts = scriptContainer.listFiles();
if (scripts != null && scripts.length > 0) {
Arrays.sort(scripts, new Comparator<File>() {
@Override
public int compare(File file1, File file2) {
return file1.getName().compareTo(file2.getName());
}
});
for (File script : scripts) {
String engineName = ScriptExecutor.getEngineNameByExtension(script.getName());
if(!scriptEngines.containsKey(engineName)) {
scriptEngines.put(engineName, scriptEngineManager.getEngineByName(engineName));
}
ScriptEngine scriptEngine = scriptEngines.get(engineName);
try {
ScriptExecutor scriptExecutor = new ScriptExecutor(scriptEngine, script, null);
Boolean disqualify = scriptExecutor.getBooleanScriptValue("disqualify");
String reason = scriptExecutor.getStringScriptValue("reason");
System.out.println("disqualify: " + disqualify);
System.out.println("reason: " + reason);
} catch (Exception e) {
e.printStackTrace();
}
}
// cleanup
for(Map.Entry<String, ScriptEngine> entry : scriptEngines.entrySet()) {
ScriptEngine engine = entry.getValue();
engine.getContext().setErrorWriter(null);
engine.getContext().setReader(null);
engine.getContext().setWriter(null);
}
}
}
}
ScriptExecutor
public class ScriptExecutor {
private final static String pythonExtension = "py";
private final static String pythonEngine = "python";
private final ScriptEngine scriptEngine;
public ScriptExecutor(ScriptEngine se, File file, Map<String, Object> keyValues) throws FileNotFoundException, ScriptException {
scriptEngine = se;
if (keyValues != null) {
for (Map.Entry<String, Object> entry : keyValues.entrySet()) {
scriptEngine.put(entry.getKey(), entry.getValue());
}
}
// execute script
Reader reader = null;
try {
reader = new FileReader(file);
scriptEngine.eval(reader);
} finally {
if (reader != null) {
try {
reader.close();
} catch (IOException e) {
// nothing to do
}
}
}
}
public Boolean getBooleanScriptValue(String key) {
// convert Object to Boolean (minimal sketch of the elided body)
Object value = scriptEngine.get(key);
if (value == null) {
return null;
}
return (value instanceof Boolean) ? (Boolean) value : Boolean.valueOf(value.toString());
}
public String getStringScriptValue(String key) {
// convert Object to String (minimal sketch of the elided body)
Object value = scriptEngine.get(key);
return (value == null) ? null : value.toString();
}
public static String getEngineNameByExtension(String fileName) {
String extension = fileName.substring(fileName.lastIndexOf(".") + 1);
if (pythonExtension.equalsIgnoreCase(extension)) {
System.out.println("Found engine " + pythonEngine + " for extension " + extension + ".");
return pythonEngine;
}
throw new RuntimeException("No suitable engine found for extension " + extension);
}
}
In the specified directory there are 14 Python scripts that all look like this:
disqualify = True
reason = "reason"
I start this program with the following VM arguments:
-Xrs -Xms16M -Xmx16M -XX:MaxPermSize=32M -XX:NewRatio=3 -Dsun.rmi.dgc.client.gcInterval=300000 -Dsun.rmi.dgc.server.gcInterval=300000 -XX:+UseConcMarkSweepGC -XX:+UseParNewGC -XX:+CMSParallelRemarkEnabled -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -server
These are the arguments our AppServer is running with. Only Xms, Xmx and MaxPermSize are smaller in my test case.
When I run this application I can see that the CMS Old Gen pool increases to its maximum size. After that the Par Eden Space pool increases, and at some point the ParNew GC does not run anymore. The cleanup part improved the situation but didn't resolve the problem. Does anybody have an idea why my heap isn't completely cleaned?

I think I have found a solution for my problem: I removed the JSR 223 stuff and now use the PythonInterpreter directly.
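For reference, a minimal sketch of what using the PythonInterpreter directly can look like (an assumption about the setup, not the exact production code; the script path is hypothetical, and cleanup() is Jython's explicit release hook, with close() also available on Jython 2.7+):

import org.python.core.PyObject;
import org.python.util.PythonInterpreter;

public class DirectJythonExample {
    public static void main(String[] args) {
        PythonInterpreter interpreter = new PythonInterpreter();
        try {
            // run the script and read back the variables it defines
            interpreter.execfile("/path/to/script.py"); // hypothetical path
            PyObject disqualify = interpreter.get("disqualify");
            PyObject reason = interpreter.get("reason");
            System.out.println("disqualify: " + disqualify);
            System.out.println("reason: " + reason);
        } finally {
            interpreter.cleanup(); // release interpreter state explicitly
        }
    }
}

Unlike the JSR 223 route, this keeps the interpreter's lifecycle in your own hands, which is presumably why the Py* instances no longer accumulate.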

Related

Extent report version 4 - Creates two extent reports instead of one HTML report for all executed test cases

I am using Extent Reports version 4 and want one .html report after executing all the test cases, but it creates two HTML reports for the 3 methods in my test class.
In the test class I have written the code so that @BeforeMethod executes before each test case, followed by the test case itself; in @AfterMethod the report is flushed to generate the HTML report, and after that @AfterClass is used to quit the driver.
Testclass:
public class HomePageTest extends BaseClass {
HomePage homePage;
public HomePageTest() {
super();
}
@BeforeMethod
@Parameters({ "platformName", "url", "udid" })
public void setUpHomePageClass(String platformName, String url, String udid) throws Exception {
try {
BaseClass baseClass = new BaseClass();
baseClass.initialize_driver(platformName, url, udid);
homePage = new HomePage(driver);
} catch (MalformedURLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Test(priority = 1, description = "Verify element i.e Top50 Txt on homepage test")
@Severity(SeverityLevel.NORMAL)
@Description("TestCase Description: Verify element i.e Top50 Txt on homepage")
public void verifyeElementsOnHomePageTest() throws Exception {
log.info("***Executing verifyElementsOnHomeScreenTest***");
logger = extent.createTest("Verify the elements on HomePage after redirecting to the splash screen");
log.info("wait for continue_button to be clickable");
TestUtil.waitForElementToBeClickable(By.id("continue_button"));
homePage.clickContinueBtnAfterSplashScreen();
log.info("Clicked on continue_button");
log.info("waitForUserNameToBeClickable - username");
boolean flag = homePage.validateTop50Txt();
Assert.assertTrue(flag);
log.info("Top50Txt isDisplayed");
log.info("verifyElementsonHomeScreenTest Ended");
}
@Test(priority = 2, description = "Swipe to next video test")
@Severity(SeverityLevel.NORMAL)
@Description("TestCase Description: Swipe from one video to another")
public void swipeToNxtVideoTest() throws InterruptedException {
try {
logger = extent.createTest("Swipe from one video to another & get the username ");
log.info("***Executing swipeToNxtVideoTest***");
log.info("waitForElementToPresenceOfElementLocated - username");
TestUtil.waitForElementToPresenceOfElementLocated(By.id("user_name"));
log.info("swipeverticalDown for nxt video");
TestUtil.swipeverticalDown();
log.info("swipeToNxtVideoTest Ended");
} catch (Exception e) {
e.printStackTrace();
log.error("Found Exception - swipeToNxtVideoTest");
}
}
/*
* @Test(priority = 3, retryAnalyzer =
* com.automation.listeners.RetryAnalyzer.class ) public void checkFailure() {
* Assert.assertEquals(true, false); System.out.println("failed");
*
* }
*/
@AfterMethod
public void getResult(ITestResult result) throws Exception {
if (result.getStatus() == ITestResult.FAILURE) {
logger.log(Status.FAIL,
MarkupHelper.createLabel(result.getName() + " - Test Case Failed", ExtentColor.RED));
logger.log(Status.FAIL,
MarkupHelper.createLabel(result.getThrowable() + " - Test Case Failed", ExtentColor.RED));
String screenshotPath = TestUtil.captureScreenAsBase64(driver, result.getName());
logger.fail("Snapshot below: " + logger.addScreenCaptureFromPath(screenshotPath));
} else if (result.getStatus() == ITestResult.SKIP) {
logger.log(Status.SKIP,
MarkupHelper.createLabel(result.getName() + " - Test Case Skipped", ExtentColor.ORANGE));
} else if (result.getStatus() == ITestResult.SUCCESS) {
logger.log(Status.PASS,
MarkupHelper.createLabel(result.getName() + " Test Case PASSED", ExtentColor.GREEN));
}
extent.flush();
}
@AfterClass
public void quitDriver() {
getDriver().quit();
}
Please let me know where my code is lacking; my intuition is that there is an issue with the TestNG annotations in my code.
Base Class:
DesiredCapabilities capabilities = new DesiredCapabilities();
public void setDriver(AppiumDriver<MobileElement> driver) {
tdriver.set(driver);
}
public static synchronized AppiumDriver<MobileElement> getDriver() {
return tdriver.get();
}
public BaseClass() {
try {
prop = new Properties();
FileInputStream ip = new FileInputStream(
System.getProperty("user.dir") + "/src/main/java/com/automation/config/config.properties");
prop.load(ip);
// extend reports
Date date = new Date();
SimpleDateFormat dateFormatFolder = new SimpleDateFormat("dd_MMM_yyyy");
File ResultDir = new File(System.getProperty("user.dir") + File.separator + "/FrameworkReports/"
+ dateFormatFolder.format(date));
// Defining Directory/Folder Name
if (!ResultDir.exists()) { // Checks that Directory/Folder Doesn't Exists!
ResultDir.mkdir();
}
SimpleDateFormat dateFormat = new SimpleDateFormat("dd_MMM_yyyy_hh_mm_ssaa");
htmlReporter = new ExtentHtmlReporter(
ResultDir + "/" + "Report" + " " + dateFormat.format(date) + " .html");
htmlReporter.config().setDocumentTitle("Automation Report");
htmlReporter.config().setReportName("YOVO AUTOMATION");
htmlReporter.config().setTheme(Theme.DARK);
extent = new ExtentReports();
extent.attachReporter(htmlReporter);
extent.setSystemInfo("Host Name", "localhost");
extent.setSystemInfo("Environment", "Windows 7");
extent.setSystemInfo("User Name", "Abhishek Chauhan");
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
public void initialize_driver(String platformName, String url, String udid) throws Exception {
log = LogManager.getLogger(BaseClass.class);
BasicConfigurator.configure();
File appDir = new File("/src/main/resources/apk");
File app = new File(appDir, "yovoapp-release.apk");
mDirpath = System.getProperty("user.dir");
mApkfilepath = mDirpath + "/app/yovoapp-release.apk";
capabilities.setCapability(MobileCapabilityType.PLATFORM_NAME, platformName);
capabilities.setCapability(MobileCapabilityType.UDID, udid);
switch (platformName) {
case "Android":
capabilities.setCapability(MobileCapabilityType.NEW_COMMAND_TIMEOUT, 60);
capabilities.setCapability("appPackage", prop.getProperty("androidAppPackage"));
capabilities.setCapability("appActivity", prop.getProperty("androidAppActivity"));
capabilities.setCapability("app", mApkfilepath);
capabilities.setCapability("noReset", true);
driver = new AppiumDriver<MobileElement>(new URL(url), capabilities);
// tdriver.set(driver);
// return getDriver();
break; // without this break, the Android case falls through into the IOS case
case "IOS":
File classpathRoot = new File(System.getProperty("user.dir"));
// File appDir = new File(classpathRoot, "/build/");
// File app = new File(appDir, "WordPress.app");
capabilities.setCapability("platformVersion", "9.2");
capabilities.setCapability("deviceName", "iPhone 6");
capabilities.setCapability("app", app.getAbsolutePath());
// driver = new IOSDriver<MobileElement>(new
// URL("http://127.0.0.1:4723/wd/hub"), caps);
break;
default:
throw new Exception("Invalid platform! - " + platformName);
}
setDriver(driver);
}
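For what it's worth, one thing that stands out in the code above: the ExtentReports instance and its ExtentHtmlReporter are created in the BaseClass constructor, and setUpHomePageClass() calls new BaseClass() before every test method, so a single run can attach several reporters, each with a different second-resolution timestamp in the file name. That may explain why more than one HTML file shows up. A common pattern is to create the report exactly once and flush it once; a minimal sketch (the ExtentManager class and report path are hypothetical, assuming the Extent Reports 4 API):

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.reporter.ExtentHtmlReporter;

public class ExtentManager {
    private static ExtentReports extent;

    // lazily create a single ExtentReports for the whole suite
    public static synchronized ExtentReports getInstance() {
        if (extent == null) {
            ExtentHtmlReporter htmlReporter = new ExtentHtmlReporter(
                    System.getProperty("user.dir") + "/FrameworkReports/Report.html"); // hypothetical path
            extent = new ExtentReports();
            extent.attachReporter(htmlReporter);
        }
        return extent;
    }
}

With this, BaseClass would call ExtentManager.getInstance() instead of building a new report per instance, and the extent.flush() call could move from @AfterMethod to an @AfterSuite method so only one HTML file is written.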

Reading and writing the file from and to WinSCP from the S3 object store

I am trying to put and read files on a remote file system using WinSCP over an SFTP connection. The leaf node of the file system is an S3 object store which contains the files (e.g. xyz.txt).
Below is the overridden method of the FileChannel class.
XYZFileSystemProvider
public class XYZFileSystemProvider extends FileSystemProvider {
@Override
public FileChannel newFileChannel(Path path, Set<? extends OpenOption> options, FileAttribute<?>... attrs)
throws IOException {
// TODO Auto-generated method stub
Collection<XYZOptions.OpenMode> modes = XYZOptions.OpenMode.fromOpenOptions(options);
if (modes.isEmpty()) {
modes = EnumSet.of(XYZOptions.OpenMode.Read, XYZOptions.OpenMode.Write);
}
// TODO: process file attributes
return new XYZFileSystemChannel(path, modes);
}
}
XYZFileSystemChannel
public class XYZFileSystemChannel extends XYZRemotePathChannel{
public XYZFileSystemChannel(XYZPath p, Collection<XYZOptions.OpenMode> modes) throws IOException {
this(Objects.requireNonNull(p, "No target path").toString(), p.getFileSystem(), modes);
}
public XYZFileSystemChannel(String remotePath, XYZFileSystem fs, Collection<XYZOptions.OpenMode> modes) throws IOException {
super(remotePath, fs, true, modes);
}
}
XYZRemotePathChannel
public class XYZRemotePathChannel extends FileChannel {
private AmazonS3Component getAmazonS3Instance() {
return SpringContext.getBean(AmazonS3Component.class);
}
private final String path;
private final Collection<XYZOptions.OpenMode> modes;
private final boolean closeOnExit;
private XYZFileSystem fileSystem;
private final AtomicLong posTracker = new AtomicLong(0L);
public static final Set<XYZOptions.OpenMode> READ_MODES =
Collections.unmodifiableSet(EnumSet.of(XYZOptions.OpenMode.Read));
private final Object lock = new Object();
private final AtomicReference<Thread> blockingThreadHolder = new AtomicReference<>(null);
public XYZRemotePathChannel(String path, XYZFileSystem fileSystem, boolean closeOnExit,
Collection<XYZOptions.OpenMode> modes) throws IOException {
this.path = ValidateUtils.checkNotNullAndNotEmpty(path, "No remote file path specified");
this.modes = Objects.requireNonNull(modes, "No channel modes specified");
this.closeOnExit = closeOnExit;
this.fileSystem = fileSystem;
}
@Override
public int read(ByteBuffer dst) throws IOException {
// TODO Auto-generated method stub
log.debug("Position of dst is : {}",dst.position());
log.debug("Reading the bytes of the file : {}", dst);
//Some code to be done here in order to read dst and send bytes of the file received from the S3 store
return (int) doRead(Collections.singletonList(dst), -1);
}
protected long doRead(List<ByteBuffer> buffers, long position) throws IOException {
log.debug("Do Reading the bytes of the file of list of buffer : {} and position :{}", buffers , position);
ensureOpen(READ_MODES);
synchronized (lock) {
boolean completed = false;
boolean eof = false;
long curPos = (position >= 0L) ? position : posTracker.get();
byte[] bytes = new byte[(int) curPos];
try {
long totalRead = 0;
beginBlocking();
String [] parts = this.path.toString().replaceFirst("^/", "").split("/");
String bucket = parts[parts.length-2];
String fileName = parts[parts.length-1];
InputStream fileContent = getAmazonS3Instance().getFileFromBucket(bucket, fileName);
log.debug("Contens of the file: {} from bucket: {} are : {}", fileName , bucket, fileContent);
//Some code to be done here to return the content byte length??
int fileLenght = fileContent.read(bytes, 1, (int) curPos);
log.debug("After reading the file content the file length is : {}" , fileLenght );
return fileLenght;
} finally {
if (position < 0L) {
posTracker.set(curPos);
}
endBlocking(completed);
}
}
}
private void endBlocking(boolean completed) throws AsynchronousCloseException {
blockingThreadHolder.set(null);
end(completed);
}
private void beginBlocking() {
begin();
blockingThreadHolder.set(Thread.currentThread());
}
@Override
public FileChannel position(long newPosition) throws IOException {
// TODO Auto-generated method stub
log.debug("Setting the position of the file : {}", newPosition);
if (newPosition < 0L) {
throw new IllegalArgumentException("position(" + this.path + ") illegal file channel position: " + newPosition);
}
ensureOpen(Collections.emptySet());
posTracker.set(newPosition);
return this;
}
private void ensureOpen(Collection<XYZOptions.OpenMode> reqModes) throws IOException {
if (!isOpen()) {
throw new ClosedChannelException();
}
if (GenericUtils.size(reqModes) > 0) {
for (XYZOptions.OpenMode m : reqModes) {
if (this.modes.contains(m)) {
return;
}
}
throw new IOException("ensureOpen(" + this.path + ") current channel modes (" + this.modes
+ ") do contain any of the required: " + reqModes);
}
}
}
XYZOptions
public class XYZOptions {
enum OpenMode {
Read, Write, Append, Create, Truncate, Exclusive;
public static final Set<OpenOption> SUPPORTED_OPTIONS = Collections
.unmodifiableSet(EnumSet.of(StandardOpenOption.READ, StandardOpenOption.APPEND,
StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING, StandardOpenOption.WRITE,
StandardOpenOption.CREATE_NEW, StandardOpenOption.SPARSE));
public static Set<OpenMode> fromOpenOptions(Collection<? extends OpenOption> options) {
if (GenericUtils.isEmpty(options)) {
return Collections.emptySet();
}
Set<OpenMode> modes = EnumSet.noneOf(OpenMode.class);
for (OpenOption option : options) {
if (option == StandardOpenOption.READ) {
modes.add(Read);
} else if (option == StandardOpenOption.APPEND) {
modes.add(Append);
} else if (option == StandardOpenOption.CREATE) {
modes.add(Create);
} else if (option == StandardOpenOption.TRUNCATE_EXISTING) {
modes.add(Truncate);
} else if (option == StandardOpenOption.WRITE) {
modes.add(Write);
} else if (option == StandardOpenOption.CREATE_NEW) {
modes.add(Create);
modes.add(Exclusive);
} else if (option == StandardOpenOption.SPARSE) {
continue;
} else {
throw new IllegalArgumentException("Unsupported open option: " + option);
}
}
return modes;
}
}
}
I am able to fetch the file from the S3 store but I am not sure how to read and pass all its contents when someone drags and drops it from the remote file location to their own system using WinSCP. I know I am missing some code at the mentioned place but I am not sure how to achieve it.
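A minimal sketch of the missing read logic (an assumption, not tested against WinSCP: it presumes getFileFromBucket() returns a fresh InputStream positioned at the start of the object, and it fast-forwards to the channel position on every call, which is simple but inefficient for large files):

protected long doRead(List<ByteBuffer> buffers, long position) throws IOException {
    ensureOpen(READ_MODES);
    synchronized (lock) {
        long curPos = (position >= 0L) ? position : posTracker.get();
        long totalRead = 0;
        boolean completed = false;
        try {
            beginBlocking();
            String[] parts = this.path.replaceFirst("^/", "").split("/");
            String bucket = parts[parts.length - 2];
            String fileName = parts[parts.length - 1];
            try (InputStream in = getAmazonS3Instance().getFileFromBucket(bucket, fileName)) {
                // skip to the current channel position
                long toSkip = curPos;
                while (toSkip > 0) {
                    long skipped = in.skip(toSkip);
                    if (skipped <= 0) {
                        break;
                    }
                    toSkip -= skipped;
                }
                // copy the stream into the destination buffers
                byte[] chunk = new byte[8192];
                for (ByteBuffer buffer : buffers) {
                    while (buffer.hasRemaining()) {
                        int n = in.read(chunk, 0, Math.min(chunk.length, buffer.remaining()));
                        if (n < 0) {
                            completed = true;
                            return (totalRead > 0) ? totalRead : -1; // EOF
                        }
                        buffer.put(chunk, 0, n);
                        totalRead += n;
                        curPos += n;
                    }
                }
            }
            completed = true;
            return totalRead;
        } finally {
            if (position < 0L) {
                posTracker.set(curPos);
            }
            endBlocking(completed);
        }
    }
}

read(ByteBuffer dst) can then keep delegating to doRead(Collections.singletonList(dst), -1), and WinSCP's sequential reads will advance posTracker correctly.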

Use Memcache in Dataflow: NullPointerException at NamespaceManager.get

I am trying to access the GAE Memcache and Datastore APIs from Dataflow.
I have followed How to use memcache in dataflow? and set up the Remote API: https://cloud.google.com/appengine/docs/java/tools/remoteapi
In my pipeline I have written:
public static void main(String[] args) throws IOException {
RemoteApiOptions remApiOpts = new RemoteApiOptions()
.server("xxx.appspot.com", 443)
.useApplicationDefaultCredential();
RemoteApiInstaller installer = new RemoteApiInstaller();
installer.install(remApiOpts);
try {
DatastoreConfigManager2.registerConfig("myconfig");
final String topic = DatastoreConfigManager2.getString("pubsub.topic");
final String stagingDir = DatastoreConfigManager2.getString("dataflow.staging");
...
bqRows.apply(BigQueryIO.Write
.named("Insert row")
.to(new SerializableFunction<BoundedWindow, String>() {
@Override
public String apply(BoundedWindow window) {
// The cast below is safe because CalendarWindows.days(1) produces IntervalWindows.
IntervalWindow day = (IntervalWindow) window;
String dataset = DatastoreConfigManager2.getString("dataflow.bigquery.dataset");
String tablePrefix = DatastoreConfigManager2.getString("dataflow.bigquery.tablenametemplate");
String dayString = DateTimeFormat.forPattern("yyyyMMdd")
.print(day.start());
String tableName = dataset + "." + tablePrefix + dayString;
LOG.info("Writing to BigQuery " + tableName);
return tableName;
}
})
where DatastoreConfigManager2 is
public class DatastoreConfigManager2 {
private static final DatastoreService DATASTORE = DatastoreServiceFactory.getDatastoreService();
private static final MemcacheService MEMCACHE = MemcacheServiceFactory.getMemcacheService();
static {
MEMCACHE.setErrorHandler(ErrorHandlers.getConsistentLogAndContinue(Level.INFO));
}
private static Set<String> configs = Sets.newConcurrentHashSet();
public static void registerConfig(String name) {
configs.add(name);
}
private static class DatastoreCallbacks {
// https://cloud.google.com/appengine/docs/java/datastore/callbacks
@PostPut
public void updateCacheOnPut(PutContext context) {
Entity entity = context.getCurrentElement();
if (configs.contains(entity.getKind())) {
String id = (String) entity.getProperty("id");
String value = (String) entity.getProperty("value");
MEMCACHE.put(id, value);
}
}
}
private static String lookup(String id) {
String value = (String) MEMCACHE.get(id);
if (value != null) return value;
else {
for (String config : configs) {
try {
PreparedQuery pq = DATASTORE.prepare(new Query(config)
.setFilter(new FilterPredicate("id", FilterOperator.EQUAL, id)));
for (Entity entity : pq.asIterable()) {
value = (String) entity.getProperty("value"); // use last
}
if (value != null) MEMCACHE.put(id, value);
} catch (Exception e) {
e.printStackTrace();
}
}
}
return value;
}
public static String getString(String id) {
return lookup(id);
}
}
When my pipeline runs on Dataflow I get the exception
Caused by: java.lang.NullPointerException
at com.google.appengine.api.NamespaceManager.get(NamespaceManager.java:101)
at com.google.appengine.api.memcache.BaseMemcacheServiceImpl.getEffectiveNamespace(BaseMemcacheServiceImpl.java:65)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.doGet(AsyncMemcacheServiceImpl.java:401)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.get(AsyncMemcacheServiceImpl.java:412)
at com.google.appengine.api.memcache.MemcacheServiceImpl.get(MemcacheServiceImpl.java:49)
at my.training.google.common.config.DatastoreConfigManager2.lookup(DatastoreConfigManager2.java:80)
at my.training.google.common.config.DatastoreConfigManager2.getString(DatastoreConfigManager2.java:117)
at my.training.google.mss.pipeline.InsertIntoBqWithCalendarWindow$1.apply(InsertIntoBqWithCalendarWindow.java:101)
at my.training.google.mss.pipeline.InsertIntoBqWithCalendarWindow$1.apply(InsertIntoBqWithCalendarWindow.java:95)
at com.google.cloud.dataflow.sdk.io.BigQueryIO$Write$Bound$TranslateTableSpecFunction.apply(BigQueryIO.java:1496)
at com.google.cloud.dataflow.sdk.io.BigQueryIO$Write$Bound$TranslateTableSpecFunction.apply(BigQueryIO.java:1486)
at com.google.cloud.dataflow.sdk.io.BigQueryIO$TagWithUniqueIdsAndTable.tableSpecFromWindow(BigQueryIO.java:2641)
at com.google.cloud.dataflow.sdk.io.BigQueryIO$TagWithUniqueIdsAndTable.processElement(BigQueryIO.java:2618)
Any suggestions? Thanks in advance.
EDIT: my functional requirement is building a pipeline with some configurable steps based on datastore entries.
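One likely explanation (an inference from how the Remote API works, not verified on this exact pipeline): RemoteApiInstaller.install() binds the App Engine environment to the JVM and thread where it runs, i.e. the machine that constructs the pipeline in main(). The SerializableFunction, however, executes later on Dataflow worker threads, where no environment is installed, so NamespaceManager.get() dereferences a null ApiProxy environment. A sketch of installing the Remote API lazily on each worker before the first Memcache/Datastore call (the server name is copied from the question; the helper class is hypothetical):

import java.io.IOException;
import com.google.appengine.tools.remoteapi.RemoteApiInstaller;
import com.google.appengine.tools.remoteapi.RemoteApiOptions;

public class RemoteApiSupport {
    // one installation per worker thread, since the environment is thread-bound
    private static final ThreadLocal<Boolean> INSTALLED =
            ThreadLocal.withInitial(() -> Boolean.FALSE);

    public static void ensureInstalled() {
        if (!INSTALLED.get()) {
            try {
                RemoteApiOptions options = new RemoteApiOptions()
                        .server("xxx.appspot.com", 443)
                        .useApplicationDefaultCredential();
                new RemoteApiInstaller().install(options);
                INSTALLED.set(Boolean.TRUE);
            } catch (IOException e) {
                throw new RuntimeException("Remote API install failed", e);
            }
        }
    }
}

Calling RemoteApiSupport.ensureInstalled() at the top of DatastoreConfigManager2.lookup() would then make the Memcache and Datastore stubs available wherever the lookup actually runs.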

Easy way to dynamically invoke web services (without JDK or proxy classes)

In Python I can consume a web service so easily:
from suds.client import Client
client = Client('http://www.example.org/MyService/wsdl/myservice.wsdl') #create client
result = client.service.myWSMethod("Bubi", 15) #invoke method
print result #print the result returned by the WS method
I'd like to reach such a simple usage with Java.
With Axis or CXF you have to create a web service client, i.e. a package which reproduces all the web service methods so that you can invoke them as if they were normal methods. Let's call them proxy classes; usually they are generated by the wsdl2java tool.
Useful and user-friendly. But any time I add or modify a web service method and want to use it in a client program, I need to regenerate the proxy classes.
So I found CXF DynamicClientFactory, this technique avoids the use of proxy classes:
import org.apache.cxf.endpoint.Client;
import org.apache.cxf.endpoint.dynamic.DynamicClientFactory;
//...
//create client
DynamicClientFactory dcf = DynamicClientFactory.newInstance();
Client client = dcf.createClient("http://www.example.org/MyService/wsdl/myservice.wsdl");
//invoke method
Object[] res = client.invoke("myWSMethod", "Bubi");
//print the result
System.out.println("Response:\n" + res[0]);
But unfortunately it creates and compiles proxy classes at runtime, hence requires a JDK on the production machine. I have to avoid this, or at least I can't rely on it.
My question:
Is there another way to dynamically invoke any method of a web service in Java, without having a JDK at runtime and without generating "static" proxy classes? Maybe with a different library? Thanks!
I know this is a really old question, but if you are still interested you could use the soap-ws GitHub project: https://github.com/reficio/soap-ws
Here is a really simple sample usage:
Wsdl wsdl = Wsdl.parse("http://www.webservicex.net/CurrencyConvertor.asmx?WSDL");
SoapBuilder builder = wsdl.binding()
.localPart("CurrencyConvertorSoap")
.find();
SoapOperation operation = builder.operation()
.soapAction("http://www.webserviceX.NET/ConversionRate")
.find();
Request request = builder.buildInputMessage(operation);
SoapClient client = SoapClient.builder()
.endpointUrl("http://www.webservicex.net/CurrencyConvertor.asmx")
.build();
String response = client.post(request);
As you can see it is really simple.
With CXF 3.x this is possible with a custom StaxDataBinding. Follow the steps below to get the basics; of course, this could be enhanced to your needs.
First, create a StaxDataBinding something like below. Note that this code can be refined further.
class StaxDataBinding extends AbstractInterceptorProvidingDataBinding {
private XMLStreamDataReader xsrReader;
private XMLStreamDataWriter xswWriter;
public StaxDataBinding() {
super();
this.xsrReader = new XMLStreamDataReader();
this.xswWriter = new XMLStreamDataWriter();
inInterceptors.add(new StaxInEndingInterceptor(Phase.POST_INVOKE));
inFaultInterceptors.add(new StaxInEndingInterceptor(Phase.POST_INVOKE));
inInterceptors.add(RemoveStaxInEndingInterceptor.INSTANCE);
inFaultInterceptors.add(RemoveStaxInEndingInterceptor.INSTANCE);
}
static class RemoveStaxInEndingInterceptor
extends AbstractPhaseInterceptor<Message> {
static final RemoveStaxInEndingInterceptor INSTANCE = new RemoveStaxInEndingInterceptor();
public RemoveStaxInEndingInterceptor() {
super(Phase.PRE_INVOKE);
addBefore(StaxInEndingInterceptor.class.getName());
}
public void handleMessage(Message message) throws Fault {
message.getInterceptorChain().remove(StaxInEndingInterceptor.INSTANCE);
}
}
public void initialize(Service service) {
for (ServiceInfo serviceInfo : service.getServiceInfos()) {
SchemaCollection schemaCollection = serviceInfo.getXmlSchemaCollection();
if (schemaCollection.getXmlSchemas().length > 1) {
// Schemas are already populated.
continue;
}
new ServiceModelVisitor(serviceInfo) {
public void begin(MessagePartInfo part) {
if (part.getTypeQName() != null
|| part.getElementQName() != null) {
return;
}
part.setTypeQName(Constants.XSD_ANYTYPE);
}
}.walk();
}
}
@SuppressWarnings("unchecked")
public <T> DataReader<T> createReader(Class<T> cls) {
if (cls == XMLStreamReader.class) {
return (DataReader<T>) xsrReader;
}
else {
throw new UnsupportedOperationException(
"The type " + cls.getName() + " is not supported.");
}
}
public Class<?>[] getSupportedReaderFormats() {
return new Class[] { XMLStreamReader.class };
}
@SuppressWarnings("unchecked")
public <T> DataWriter<T> createWriter(Class<T> cls) {
if (cls == XMLStreamWriter.class) {
return (DataWriter<T>) xswWriter;
}
else {
throw new UnsupportedOperationException(
"The type " + cls.getName() + " is not supported.");
}
}
public Class<?>[] getSupportedWriterFormats() {
return new Class[] { XMLStreamWriter.class, Node.class };
}
public static class XMLStreamDataReader implements DataReader<XMLStreamReader> {
public Object read(MessagePartInfo part, XMLStreamReader input) {
return read(null, input, part.getTypeClass());
}
public Object read(QName name, XMLStreamReader input, Class<?> type) {
return input;
}
public Object read(XMLStreamReader reader) {
return reader;
}
public void setSchema(Schema s) {
}
public void setAttachments(Collection<Attachment> attachments) {
}
public void setProperty(String prop, Object value) {
}
}
public static class XMLStreamDataWriter implements DataWriter<XMLStreamWriter> {
private static final Logger LOG = LogUtils
.getL7dLogger(XMLStreamDataWriter.class);
public void write(Object obj, MessagePartInfo part, XMLStreamWriter writer) {
try {
if (!doWrite(obj, writer)) {
// Write your logic for how you want to handle the input data here;
// the code below just calls the toString() method.
if (part.isElement()) {
QName element = part.getElementQName();
writer.writeStartElement(element.getNamespaceURI(),
element.getLocalPart());
if (obj != null) {
writer.writeCharacters(obj.toString());
}
writer.writeEndElement();
}
}
}
catch (XMLStreamException e) {
throw new Fault("COULD_NOT_READ_XML_STREAM", LOG, e);
}
}
public void write(Object obj, XMLStreamWriter writer) {
try {
if (!doWrite(obj, writer)) {
throw new UnsupportedOperationException("Data types of "
+ obj.getClass() + " are not supported.");
}
}
catch (XMLStreamException e) {
throw new Fault("COULD_NOT_READ_XML_STREAM", LOG, e);
}
}
private boolean doWrite(Object obj, XMLStreamWriter writer)
throws XMLStreamException {
if (obj instanceof XMLStreamReader) {
XMLStreamReader xmlStreamReader = (XMLStreamReader) obj;
StaxUtils.copy(xmlStreamReader, writer);
xmlStreamReader.close();
return true;
}
else if (obj instanceof XMLStreamWriterCallback) {
((XMLStreamWriterCallback) obj).write(writer);
return true;
}
return false;
}
public void setSchema(Schema s) {
}
public void setAttachments(Collection<Attachment> attachments) {
}
public void setProperty(String key, Object value) {
}
}
}
Prepare your input to match the expected input, something like below
private Object[] prepareInput(BindingOperationInfo operInfo, String[] paramNames,
String[] paramValues) {
List<Object> inputs = new ArrayList<Object>();
List<MessagePartInfo> parts = operInfo.getInput().getMessageParts();
if (parts != null && parts.size() > 0) {
for (MessagePartInfo partInfo : parts) {
QName element = partInfo.getElementQName();
String localPart = element.getLocalPart();
// whatever your input data you need to match data value for given element
// below code assumes names are paramNames variable and value in paramValues
for (int i = 0; i < paramNames.length; i++) {
if (paramNames[i].equals(localPart)) {
inputs.add(findParamValue(paramNames, paramValues, localPart));
}
}
}
}
return inputs.toArray();
}
Now set the proper data binding and pass the data
Bus bus = CXFBusFactory.getThreadDefaultBus();
WSDLServiceFactory sf = new WSDLServiceFactory(bus, wsdl);
sf.setAllowElementRefs(false);
Service svc = sf.create();
Client client = new ClientImpl(bus, svc, null,
SimpleEndpointImplFactory.getSingleton());
StaxDataBinding databinding = new StaxDataBinding();
svc.setDataBinding(databinding);
bus.getFeatures().add(new StaxDataBindingFeature());
BindingOperationInfo operInfo = ...//find the operation you need (see below)
Object[] inputs = prepareInput(operInfo, paramNames, paramValues);
client.invoke("operationname", inputs);
If needed you can match operation name something like below
private BindingOperationInfo findBindingOperation(Service service,
String operationName) {
for (ServiceInfo serviceInfo : service.getServiceInfos()) {
Collection<BindingInfo> bindingInfos = serviceInfo.getBindings();
for (BindingInfo bindingInfo : bindingInfos) {
Collection<BindingOperationInfo> operInfos = bindingInfo.getOperations();
for (BindingOperationInfo operInfo : operInfos) {
if (operInfo.getName().getLocalPart().equals(operationName)) {
if (operInfo.isUnwrappedCapable()) {
return operInfo.getUnwrappedOperation();
}
return operInfo;
}
}
}
}
return null;
}

Streaming a File From HDFS Address in Apache Flink

In my Flink code, I am streaming a file which is located in an HDFS folder. I get the error "(No such file or directory)"; however, I am sure the file name and address are correct, as I used the same ones in the batch methods and everything worked smoothly.
Does anyone know what the problem could be?
Here is my code:
DataStream<MyObject> myStream =
env.addSource(new MyObjectGenerator("hdfs://../Data/Dataset1.csv"));
and its related class:
public class MyObjectGenerator implements SourceFunction<MyObject> {
private String dataFilePath;
private float servingSpeedFactor;
private Integer rowNo ;
private transient BufferedReader reader;
private transient InputStream inputStream;
public MyObjectGenerator(String dataFilePath) {
this(dataFilePath, 1.0f);
}
public MyObjectGenerator(String dataFilePath, float servingSpeedFactor) {
this.dataFilePath = dataFilePath;
this.servingSpeedFactor = servingSpeedFactor;
rowNo = 0 ;
}
@Override
public void run(SourceContext<MyObject> sourceContext) throws Exception {
long servingStartTime = Calendar.getInstance().getTimeInMillis();
inputStream = new DataInputStream(new FileInputStream(dataFilePath));
reader = new BufferedReader(new InputStreamReader(inputStream));
String line;
long dataStartTime;
rowNo++;
if (reader.ready() && (line = reader.readLine()) != null ) {
MyObject myObject = MyObject.fromString(line);
if (myObject != null)
sourceContext.collect(myObject);
} else {
return;
}
while (reader.ready() && (line = reader.readLine()) != null) {
MyObject myObject = MyObject.fromString(line);
sourceContext.collect(myObject);
}
this.reader.close();
this.reader = null;
this.inputStream.close();
this.inputStream = null;
}
@Override
public void cancel() {
try {
if (this.reader != null) {
this.reader.close();
}
if( this.inputStream != null) {
this.inputStream.close();
}
} catch (IOException ioe) {
//
} finally {
this.reader = null;
this.inputStream = null;
}
}
}
You are trying to access a file in HDFS with Java's regular FileInputStream. FileInputStream can only access the local file system; it does not know anything about talking to HDFS. You need to use an HDFS client to read files from HDFS. See Flink's FileInputFormat as an example.
However, I would try to avoid implementing this yourself if possible. You could use Flink's FileInputFormat to read the file line-wise (returning a DataStream<String>) and a subsequent (flat) mapper that parses each line.
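A minimal sketch of that approach (the HDFS host and port in the URI are hypothetical; readTextFile reads the file line-wise through Flink's file input format, which understands hdfs:// paths when the Hadoop dependencies are on the classpath):

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// let Flink read the HDFS file and parse each line into a MyObject
DataStream<MyObject> myStream = env
        .readTextFile("hdfs://namenode:8020/Data/Dataset1.csv") // hypothetical address
        .map(line -> MyObject.fromString(line));

This removes the custom SourceFunction, and with it the FileInputStream problem, entirely.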
