I would like to implement this AntMedia iOS and Android native interface for Codename One:
import com.codename1.system.NativeInterface;
import com.codename1.ui.PeerComponent;
/**
* @deprecated Native Interface, deprecated because you normally should use the
* public API in the AntMediaClient class.
*/
public interface AntMediaNative extends NativeInterface {
/**
* Initializes the connection.
*
* @param serverURL the WebSocket URL to connect to (wss://...)
* @param streamId the stream id on the server to process
* @param mode true means MODE_PUBLISH, false means MODE_PLAY
* @param tokenId a one-time token string
* @return PeerComponent
*/
public PeerComponent createPeer(String serverURL, String streamId, boolean mode, String tokenId);
/**
* Starts the streaming according to the mode.
*/
public void startStream();
/**
* Stops the streaming.
*/
public void stopStream();
/**
* Switches the cameras.
*/
public void switchCamera();
/**
* Toggles the microphone.
*
* @return the current microphone status.
*/
public boolean toggleMic();
/**
* Stops the video source.
*/
public void stopVideoSource();
/**
* Starts or restarts the video source.
*/
public void startVideoSource();
/**
* Gets the error.
*
* @return the error, or null if there is none.
*/
public String getError();
/**
* Camera open order. By default the front camera is opened first;
* if it is set to false, a camera that is not the front camera will
* be opened instead.
*
* @param openFrontCamera if true, the front camera will be opened;
* if false, a camera that is not the front camera will be opened
* instead
*/
public void setOpenFrontCamera(boolean openFrontCamera);
}
I need help on two specific issues.
The first problem is how to get the PeerComponent in which to view the live streaming. I don't understand what I have to do in this case in the native Android and iOS code. Could you answer with example code for iOS and Android that returns a PeerComponent? Below are the links to the SDK documentation; I hope they are enough to answer this question.
The second problem is that the SDK for iOS is written in Swift: how do I call the Swift code from a native interface that must be written in Objective-C? Could you give me a code example here too?
Thank you for your support.
This is the documentation of the two SDKs:
Android SDK documentation:
https://github.com/ant-media/Ant-Media-Server/wiki/WebRTC-Android-SDK-Documentation
iOS SDK documentation:
https://github.com/ant-media/Ant-Media-Server/wiki/WebRTC-iOS-SDK-Documentation
When you use the Generate Native Interface tool in the IDE it generates matching native code: native OS method stubs for each operating system. E.g. in the case of Android the createPeer method will return a View.
So for this case you would need to create an instance of org.webrtc.SurfaceViewRenderer, store it in the class (for follow-up calls such as init) and then return it from the createPeer method, as in the sketch below.
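For example, the generated Android implementation class (its name is assumed here to be AntMediaNativeImpl; the actual name is derived from your package and interface name) could look roughly like this minimal sketch. The Ant Media specific wiring is only indicated in comments, since its exact API is described in the SDK documentation linked above:

public class AntMediaNativeImpl {
    private org.webrtc.SurfaceViewRenderer renderer;

    public android.view.View createPeer(String serverURL, String streamId, boolean mode, String tokenId) {
        // Create the renderer that the Ant Media SDK will draw the video on,
        // and keep a reference to it for the follow-up calls (startStream, etc.).
        android.app.Activity activity = com.codename1.impl.android.AndroidNativeUtil.getActivity();
        renderer = new org.webrtc.SurfaceViewRenderer(activity);
        // Here you would create the Ant Media WebRTCClient (see the Android SDK
        // documentation), hand it this renderer together with serverURL, streamId,
        // the publish/play mode and tokenId, and store it in a field.
        return renderer;
    }

    public boolean isSupported() {
        return true;
    }

    // startStream(), stopStream(), switchCamera(), toggleMic(), etc. would
    // delegate to the stored WebRTCClient instance.
}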
Related
I am trying to get messages from Kafka and send them to RSocket using Spring. Posting the server side in Spring (Java) and the client side in React.
@Configuration
@EnableConfigurationProperties(RsocketConsumerProperties.class)
public class RsocketConsumerConfiguration {
@Bean
public Function<Integer, Mono<Integer>> rsocketConsumer(RSocketRequester.Builder builder,
RsocketConsumerProperties rsocketConsumerProperties) {
RSocketRequester rSocketRequester = builder.websocket(URI.create("ws://localhost:7000/"));
return input -> rSocketRequester.route(rsocketConsumerProperties.getRoute()).data(input).retrieveMono(Integer.class);
}
}
@EnableBinding(Sink.class)
public class Listener {
@Autowired
private Function<Integer, Mono<Integer>> rsocketConsumer;
@StreamListener(Sink.INPUT)
public void fireAndForget(Integer val) {
System.out.println(val);
rsocketConsumer.apply(val).subscribe();
}
}
@Controller
public class ServerController {
@MessageMapping("data")
public Mono<Integer> hello(Integer integer) {
return Mono.just(integer);
}
}
What am I doing wrong on the server side? My client is connected but it is not able to get new messages.
client.connect().subscribe({
onComplete: socket => {
socket.fireAndForget({
data: { message: "hello from javascript!" },
metadata: null
});
},
onError: error => {
console.log("got error");
console.error(error);
},
onSubscribe: cancel => {
/* call cancel() to abort */
console.log("subscribe!");
console.log(cancel);
// cancel.cancel();
}
});
You do this: requester.route("input").data("Welcome to Rsocket").send(); where we have this:
/**
* Perform a {@link RSocket#fireAndForget fireAndForget} sending the
* provided data and metadata.
* @return a completion that indicates if the payload was sent
* successfully or not. Note, however, that this is a one-way send and there
* is no indication of whether or how the event was handled on the
* remote end.
*/
Mono<Void> send();
You see: Mono? That means it has to be subscribed to in order to initiate the reactive stream processing. See Project Reactor for more info: https://projectreactor.io/
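For illustration, here is a minimal sketch (reusing the ws://localhost:7000/ URL and the "data" route from the question) of what subscribing to that Mono looks like; block() is used only to keep the example short, in a real application you would compose the Mono into your reactive pipeline instead:

import java.net.URI;

import org.springframework.messaging.rsocket.RSocketRequester;
import reactor.core.publisher.Mono;

public class FireAndForgetExample {
    public static void main(String[] args) {
        RSocketRequester requester = RSocketRequester.builder()
                .websocket(URI.create("ws://localhost:7000/"));
        Mono<Void> completion = requester.route("data")
                .data(42)
                .send();       // lazy: nothing has been sent yet
        completion.block();    // subscribing (block() subscribes and waits) actually sends the request
    }
}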
On the other hand, it is not clear what is the server and what is the client in your case...
You do this:
/**
* Build an {@link RSocketRequester} with an
* {@link io.rsocket.core.RSocketClient} that connects over WebSocket to
* the given URL. The requester can be used to make requests
* concurrently. Requests are made over a shared connection that is also
* re-established as needed when further requests are made.
* @param uri the URL to connect to
* @return the created {@code RSocketRequester}
* @since 5.3
*/
RSocketRequester websocket(URI uri);
And I would say it means the client in the code you show. The server is on the other side, where that port 7000 is opened for the ws:// protocol. So, be sure that you understand and configure all the parts properly. For example, I don't see why you need a @RestController in your Listener class...
Since I'm implementing a custom gallery for Android and iOS, I have to directly access the gallery files stored in the FileSystemStorage through native interfaces.
The basic idea is to retrieve the file list through a native interface and then build a cross-platform GUI in Codename One. This works on Android: I had to make the thumbnail generation (on the Codename One side, not on the native interface side) as fast as possible, and the overall result is quite acceptable.
On iOS, I have an additional issue, which is the HEIC image file format that needs to be converted to JPEG to become usable in Codename One. Basically, I get the file list through the code in this question (I'm waiting for an answer...), then I have to convert each HEIC file to a temporary JPEG file, but my HEICtoJPEG native interface makes the app crash after a few images with an "out of memory" Xcode message...
I suspect that the problematic code is the following; maybe the UIImage* image and/or the NSData* mediaData are never released:
#import "myapp_utilities_HEICtoJPEGNativeImpl.h"
@implementation myapp_utilities_HEICtoJPEGNativeImpl
-(NSData*)heicToJpeg:(NSData*)param{
UIImage* image = [UIImage imageWithData:param];
NSData* mediaData = UIImageJPEGRepresentation(image, 0.9);
return mediaData;
}
-(BOOL)isSupported{
return YES;
}
@end
This is the Java native interface:
import com.codename1.system.NativeInterface;
/**
* @deprecated
*/
public interface HEICtoJPEGNative extends NativeInterface {
public byte[] heicToJpeg(byte[] heicInput);
}
and this the Java public API:
import com.codename1.io.FileSystemStorage;
import com.codename1.io.Util;
import com.codename1.system.NativeLookup;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
public class HEICtoJPEG {
private static HEICtoJPEGNative nativeInterface = NativeLookup.create(HEICtoJPEGNative.class);
/**
* Public API to convert an HEIC file to a new JPEG file (placed in /heic)
* @param heicFile the HEIC file in the FileSystemStorage
* @return the path of a new JPEG file (with a unique name)
*/
public static String convertToJPEG(String heicFile) throws IOException {
if (nativeInterface != null && nativeInterface.isSupported()) {
// Ensure that the destination directory exists.
FileSystemStorage fss = FileSystemStorage.getInstance();
String heicDir = fss.getAppHomePath() + "/heic";
if (!fss.isDirectory(heicDir)) {
fss.mkdir(heicDir);
}
ByteArrayOutputStream outHeic = new ByteArrayOutputStream();
InputStream inHeic = fss.openInputStream(heicFile);
Util.copy(inHeic, outHeic);
byte[] heicData = outHeic.toByteArray();
byte[] jpegData = nativeInterface.heicToJpeg(heicData);
String jpegFile = heicDir + "/" + DeviceUtilities.getUniqueId() + ".jpg";
OutputStream outJpeg = fss.openOutputStream(jpegFile);
ByteArrayInputStream inJpeg = new ByteArrayInputStream(jpegData);
Util.copy(inJpeg, outJpeg);
return jpegFile;
} else {
return null;
}
}
}
Since the Android counterpart works, I hope that the rest of my custom gallery code is fine and that this out-of-memory issue is in the code I posted here.
I hope you can point me to a working solution. Thank you.
There was a memory leak in the way the iOS port invoked native interface methods which received or returned primitive arrays (byte[], int[], etc.).
I have just committed a fix for this (native interface invocations are now wrapped in an autorelease pool), which will be available on the build server next Friday (October 9, 2020).
EDIT: (Friday October 2, 2020)
This fix has already been deployed to the build server, so you should be able to build again immediately and see if it fixes your issue.
I'm a bit confused about Storage and FileSystemStorage. I wrote the following methods, but I'm sure that they don't work as expected, because .contains("/") is not enough to distinguish whether we are using Storage or FileSystemStorage.
Could you please help me to fix them? Thank you
/**
* Get an InputStream for the given sourceFile, it automatically chooses
* FileSystem API or Storage API
*
* @param sourceFile
* @return
* @throws java.io.IOException
*/
public static InputStream getInputStream(String sourceFile) throws IOException {
if (sourceFile.contains("/")) {
return FileSystemStorage.getInstance().openInputStream(sourceFile);
} else {
// Storage is a flat file system
return Storage.getInstance().createInputStream(sourceFile);
}
}
/**
* Get an OutputStream for the given destFile, it automatically chooses
* FileSystem API or Storage API
*
* @param destFile
* @return
* @throws java.io.IOException
*/
public static OutputStream getOutputStream(String destFile) throws IOException {
if (destFile.contains("/")) {
return FileSystemStorage.getInstance().openOutputStream(destFile);
} else {
// Storage is a flat file system
return Storage.getInstance().createOutputStream(destFile);
}
}
Actually they should be pretty good. In theory Storage would allow you to use / as part of the file name, but honestly it isn't something we've tested and I'm not sure if that's the right thing to do.
FileSystemStorage requires an absolute path and as such will always include a slash character, so this should work fine. Technically a FileSystemStorage path should start with file:// but APIs often work without it to make native code integration easier, so that's not a great way to distinguish the API.
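If you want to be stricter, a possible variant is sketched below; it assumes FileSystemStorage paths are absolute and may carry the file:// prefix, so it tests for that prefix or a leading slash instead of contains("/"):

import java.io.IOException;
import java.io.InputStream;

import com.codename1.io.FileSystemStorage;
import com.codename1.io.Storage;

public class StorageUtil {
    /**
     * Variant of the method in the question: anything that looks like an
     * absolute path (file:// prefix or leading slash) goes to FileSystemStorage,
     * everything else goes to Storage.
     */
    public static InputStream getInputStream(String sourceFile) throws IOException {
        if (sourceFile.startsWith("file://") || sourceFile.startsWith("/")) {
            return FileSystemStorage.getInstance().openInputStream(sourceFile);
        } else {
            // Storage is a flat file system
            return Storage.getInstance().createInputStream(sourceFile);
        }
    }
}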
By default, CXF 3.0.5's JAXRSBeanValidationInInterceptor and JAXRSBeanValidationOutInterceptor do not support validation of arguments passed to request-scoped resource beans. This exclusion is enforced in org.apache.cxf.jaxrs.validation.ValidationUtils.getResourceInstance(Message). Attempting to use a request-scoped resource results in the following warning being logged:
Service object is not a singleton, use a custom invoker to validate
I've spent some time poking around and come up with the following workaround:
/**
* This is a replacement for CXF's builtin
* {@link JAXRSBeanValidationInInterceptor}. This customization supports
* validation of messages handled by non-singleton JAX-RS resource beans. This
* is needed as many of the beans in this project are request-scoped.
*/
public class BeanValidationInInterceptor extends
JAXRSBeanValidationInInterceptor {
/**
* This is a customization of the code in CXF's builtin
* {@link ValidationUtils#getResourceInstance(Message)}.
*
* @see org.apache.cxf.jaxrs.validation.JAXRSBeanValidationInInterceptor#getServiceObject(org.apache.cxf.message.Message)
*/
@Override
protected Object getServiceObject(Message message) {
final OperationResourceInfo ori = message.getExchange().get(
OperationResourceInfo.class);
if (ori == null) {
return null;
}
if (!ori.getClassResourceInfo().isRoot()) {
return message.getExchange().get(
"org.apache.cxf.service.object.last");
}
final ResourceProvider resourceProvider = ori.getClassResourceInfo()
.getResourceProvider();
return resourceProvider.getInstance(message);
}
}
It seems to work, but as I don't fully understand the reason this wasn't supported in the first place, I'm wondering if it's safe?
Any CXF devs around who can explain if/how I'm shooting myself in the foot here, and what I might do instead?
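For reference, a rough sketch of how such a custom in-interceptor can be registered when the endpoint is configured programmatically with a JAXRSServerFactoryBean; the resource class and address below are hypothetical, and with Spring XML configuration a <jaxrs:inInterceptors> entry would be the equivalent:

import javax.ws.rs.GET;
import javax.ws.rs.Path;

import org.apache.cxf.jaxrs.JAXRSServerFactoryBean;
import org.apache.cxf.jaxrs.lifecycle.PerRequestResourceProvider;

public class ServerSetup {

    @Path("/hello")
    public static class HelloResource { // hypothetical non-singleton resource
        @GET
        public String hello() {
            return "hello";
        }
    }

    public static void main(String[] args) {
        JAXRSServerFactoryBean sf = new JAXRSServerFactoryBean();
        sf.setAddress("http://localhost:8080/api"); // illustrative address
        sf.setResourceClasses(HelloResource.class);
        // A new resource instance per request, i.e. not a singleton
        sf.setResourceProvider(HelloResource.class,
                new PerRequestResourceProvider(HelloResource.class));
        // Register the custom interceptor from the question instead of the builtin one
        sf.getInInterceptors().add(new BeanValidationInInterceptor());
        sf.create();
    }
}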
I converted the EBCDIC input to ASCII using this code with Apache Camel Netty.
How do I convert binary input to ASCII? I tried all the available CharsetUtil options but it's not working.
Any suggestions or answers are welcome.
import java.beans.Encoder;
import org.apache.camel.main.Main;
import org.jboss.netty.handler.codec.frame.LengthFieldBasedFrameDecoder;
import org.jboss.netty.handler.codec.frame.LengthFieldPrepender;
import org.jboss.netty.handler.codec.string.StringDecoder;
import org.jboss.netty.handler.codec.string.StringEncoder;
import org.jboss.netty.util.CharsetUtil;
/**
* Starting point for application
* @author SubramaniMohanam
*/
public class MainApp {
/**
* Main method
* Encoders and decoders are added here
* Route builders are added
*/
@SuppressWarnings("deprecation")
public static void main(String... args) throws Exception {
Main main = new Main();
main.enableHangupSupport();
System.out.println("main started ...");
main.bind("decoder", new LengthFieldBasedFrameDecoder(40, 0, 1,0,0));
main.bind("decoder", new LengthFieldBasedFrameDecoder(1024, 4, 2,0,17));
main.bind( "stringEncoder", new StringEncoder("Cp1047"));
main.bind("stringDecoder", new StringDecoder("Cp1047"));
main.addRouteBuilder(new StationRouteBuilder());
main.run(args);
}
}
I guess that your question is how to decode EBCDIC binary data to ASCII data? If this is the case, have a look at Converting EBCDIC to ASCII in Java and write your own Camel decoder. More information about encoders/decoders can be found here: Can someone better explain Decoders/Encoders?
That said, all encoders/decoders should have a unique binding name (you used "decoder" twice).
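As a sketch of that last point, the main method could bind each codec under its own name; the names below are only illustrative and would then be referenced from the netty endpoint URI in the route builder:

import java.nio.charset.Charset;

import org.apache.camel.main.Main;
import org.jboss.netty.handler.codec.frame.LengthFieldBasedFrameDecoder;
import org.jboss.netty.handler.codec.string.StringDecoder;
import org.jboss.netty.handler.codec.string.StringEncoder;

public class MainApp {
    public static void main(String... args) throws Exception {
        Main main = new Main();
        // Give every codec its own, unique binding name.
        main.bind("headerFrameDecoder", new LengthFieldBasedFrameDecoder(40, 0, 1, 0, 0));
        main.bind("bodyFrameDecoder", new LengthFieldBasedFrameDecoder(1024, 4, 2, 0, 17));
        main.bind("ebcdicDecoder", new StringDecoder(Charset.forName("Cp1047")));
        main.bind("ebcdicEncoder", new StringEncoder(Charset.forName("Cp1047")));
        main.addRouteBuilder(new StationRouteBuilder()); // route builder from the question
        main.run(args);
    }
}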