I called the captureAudio method from the Capture class.
On iOS 7 it opens an empty dialog box with Save/Cancel buttons.
No audio bar is shown, so the user has no indication that recording is in progress.
It works fine on Android.
Since iOS doesn't have a capture UI like Android's, this is implemented entirely in Java. You can write your own implementation rather easily, e.g. this is from the Codename One IOSImplementation.java file, which does exactly that:
public void captureAudio(ActionListener response) {
    String p = FileSystemStorage.getInstance().getAppHomePath();
    if(!p.endsWith("/")) {
        p += "/";
    }
    try {
        final Media media = MediaManager.createMediaRecorder(p + "cn1TempAudioFile",
                MediaManager.getAvailableRecordingMimeTypes()[0]);
        media.play();
        boolean b = Dialog.show("Recording", "", "Save", "Cancel");
        final Dialog d = new Dialog("Recording");
        media.pause();
        media.cleanup();
        d.dispose();
        if(b) {
            response.actionPerformed(new ActionEvent(p + "cn1TempAudioFile"));
        } else {
            FileSystemStorage.getInstance().delete(p + "cn1TempAudioFile");
            response.actionPerformed(null);
        }
    } catch(IOException err) {
        err.printStackTrace();
        response.actionPerformed(null);
    }
}
When I got to know of the MediaManager API I used it to create a video player, just as the tutorial described. It runs on the simulator but does not work on my phone.
What do you think the problem is? My computer runs Windows 8 Pro, and I tried the app on an x-tigi phone and tablet. I got the code from a PDF named Codename One Developer Guide. This is the code:
final Form hi = new Form("MediaPlayer", new BorderLayout());
hi.setToolbar(new Toolbar());
Style s = UIManager.getInstance().getComponentStyle("Title");
FontImage icon = FontImage.createMaterial(FontImage.MATERIAL_VIDEO_LIBRARY, s);
hi.getToolbar().addCommandToRightBar("", icon, (evt) -> {
    Display.getInstance().openGallery((e) -> {
        if(e != null && e.getSource() != null) {
            String file = (String) e.getSource();
            try {
                Media video = MediaManager.createMedia(file, true);
                hi.removeAll();
                hi.add(BorderLayout.CENTER, new MediaPlayer(video));
                hi.revalidate();
            } catch(IOException err) {
                Log.e(err);
            }
        }
    }, Display.GALLERY_VIDEO);
});
hi.show();
I am using CrossMobile to create an app and I want to use the camera to capture and save photos from my app. I will also need to access the photos taken from the app to show them in a list. How can I present the camera view on a button press?
First of all, you might need the CoreImage plugin, or else some camera-specific permissions will not be available.
Under iOS you also need to add the NSCameraUsageDescription key to the Info.plist by hand (or else the application will crash due to Apple's restrictions).
Let's assume that you have a UIButton named cameraButton and a UIImageView named imgV, both initialized in the loadView section of your code.
Then the core would be similar to:
public void loadView() {
    // ....
    cameraButton.addTarget((sender, event) -> {
        if (!UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.Camera))
            new UIAlertView(null, "Unable to access camera", null, "Accept").show();
        else {
            UIImagePickerController picker = new UIImagePickerController();
            picker.setSourceType(UIImagePickerControllerSourceType.Camera);
            picker.setDelegate(new UIImagePickerControllerDelegate() {
                @Override
                public void didFinishPickingMediaWithInfo(UIImagePickerController picker, Map<String, Object> info) {
                    picker.dismissModalViewControllerAnimated(true);
                    UIImage img = (UIImage) info.get(UIImagePickerController.OriginalImage);
                    imgV.setImage(img);
                }

                @Override
                public void didCancel(UIImagePickerController picker) {
                    picker.dismissModalViewControllerAnimated(true);
                }
            });
            presentModalViewController(picker, true);
        }
    }, UIControlEvents.TouchUpInside);
    // ....
}
How can I let users re-adjust and crop the image, as in the Viber or Facebook apps, so that the uploaded image is proportionally scaled to the size we want?
Code:
profileImg.addActionListener((e) -> {
    Display.getInstance().openGallery(new ActionListener() {
        public void actionPerformed(ActionEvent evt) {
            try {
                if (evt == null) {
                    // System.out.println("user cancelled");
                    return;
                }
                profileImgpath = (String) evt.getSource();
                Image i = Image.createImage(profileImgpath);
                Image profileImgg = i.scaledWidth(Display.getInstance().getDisplayWidth() / 3);
                profileImg.getParent().revalidate();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }, Display.GALLERY_IMAGE);
});
You would need to build your own component to do that. After you retrieve the image from the gallery, you would display it in your widget, which would allow users to pinch-zoom, move the photo around, etc...
I don't have an example on hand, as it's a fairly specific component you'd need to build. There is plenty of documentation on the Codename One website and in the developer guide on building custom components and handling user interaction (e.g. pressing, dragging, etc...); a rough sketch of the idea follows below.
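For illustration only, here is a minimal sketch of such a component, assuming Codename One's Component API (paint, pointerPressed, pointerDragged, pinch); the class name ImageCropView, the zoom limits and the square crop size are made up for this example:

import com.codename1.ui.Component;
import com.codename1.ui.Display;
import com.codename1.ui.Graphics;
import com.codename1.ui.Image;
import com.codename1.ui.geom.Dimension;

/**
 * Sketch of a pan/zoom crop view. The math is deliberately simple; a production
 * version would clamp panning so the image always covers the crop area.
 */
public class ImageCropView extends Component {
    private final Image image;
    private float zoom = 1f;
    private float baseZoom = 1f;
    private int offsetX, offsetY;
    private int lastX = -1, lastY = -1;

    public ImageCropView(Image image) {
        this.image = image;
        setFocusable(true);
    }

    @Override
    protected Dimension calcPreferredSize() {
        // square crop area, a third of the display width like the code above
        int side = Display.getInstance().getDisplayWidth() / 3;
        return new Dimension(side, side);
    }

    @Override
    public void pointerPressed(int x, int y) {
        baseZoom = zoom;    // remember the zoom level at the start of a gesture
        lastX = x;
        lastY = y;
    }

    @Override
    protected boolean pinch(float scale) {
        // scale is relative to the finger distance at the start of the pinch
        zoom = Math.max(1f, Math.min(5f, baseZoom * scale));
        repaint();
        return true;
    }

    @Override
    public void pointerDragged(int x, int y) {
        // pan the image by the drag delta
        if(lastX != -1) {
            offsetX += x - lastX;
            offsetY += y - lastY;
        }
        lastX = x;
        lastY = y;
        repaint();
    }

    @Override
    public void paint(Graphics g) {
        int w = (int) (image.getWidth() * zoom);
        int h = (int) (image.getHeight() * zoom);
        g.drawImage(image, getX() + offsetX, getY() + offsetY, w, h);
    }

    /** Renders whatever is currently visible inside the component into a new image. */
    public Image crop() {
        Image result = Image.createImage(getWidth(), getHeight(), 0xffffffff);
        Graphics g = result.getGraphics();
        int w = (int) (image.getWidth() * zoom);
        int h = (int) (image.getHeight() * zoom);
        g.drawImage(image, offsetX, offsetY, w, h);
        return result;
    }
}

After loading the image from the gallery you could place an ImageCropView in a dialog or form, then call crop() when the user confirms the selection and upload the resulting image.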
I am trying the captureAudio example given in the Codename One documentation: https://gist.githubusercontent.com/codenameone/a347dc9dcadaa759d0cb/raw/089f171a37e43f558ce897a0b51cab46219c37c0/CaptureAudioSample.java
Copying the code here for convenience:
Form hi = new Form("Capture", BoxLayout.y());
hi.setToolbar(new Toolbar());
Style s = UIManager.getInstance().getComponentStyle("Title");
FontImage icon = FontImage.createMaterial(FontImage.MATERIAL_MIC, s);

FileSystemStorage fs = FileSystemStorage.getInstance();
String recordingsDir = fs.getAppHomePath() + "recordings/";
fs.mkdir(recordingsDir);
try {
    for(String file : fs.listFiles(recordingsDir)) {
        MultiButton mb = new MultiButton(file.substring(file.lastIndexOf("/") + 1));
        mb.addActionListener((e) -> {
            try {
                Media m = MediaManager.createMedia(recordingsDir + file, false);
                m.play();
            } catch(IOException err) {
                Log.e(err);
            }
        });
        hi.add(mb);
    }

    hi.getToolbar().addCommandToRightBar("", icon, (ev) -> {
        try {
            String file = Capture.captureAudio();
            if(file != null) {
                SimpleDateFormat sd = new SimpleDateFormat("yyyy-MMM-dd-kk-mm");
                String fileName = sd.format(new Date());
                String filePath = recordingsDir + fileName;
                Util.copy(fs.openInputStream(file), fs.openOutputStream(filePath));
                MultiButton mb = new MultiButton(fileName);
                mb.addActionListener((e) -> {
                    try {
                        Media m = MediaManager.createMedia(filePath, false);
                        m.play();
                    } catch(IOException err) {
                        Log.e(err);
                    }
                });
                hi.add(mb);
                hi.revalidate();
            }
        } catch(IOException err) {
            Log.e(err);
        }
    });
} catch(IOException err) {
    Log.e(err);
}
hi.show();
I am using IntelliJ and trying to test the above code in the device simulator. But when I click the mic button I see a file chooser dialog with only a Cancel option enabled, and clicking Cancel does nothing. If I choose a WAV file and click OK, it gets copied and I am able to play it in the simulator. Is mic input not supported in the simulator? Is it replaced with file input? Or am I doing something wrong?
We don't capture from the mic in the simulator, as we consider the file chooser more convenient for debugging and simulating actual capture cases. This allows us to reproduce failures/test cases with 100% accuracy.
Also, the media APIs in Java SE are "flaky" and we don't want to rely on them any more than we have to.
On the device you will get a recorder interface as usual.
Notice that this is true for image capture, video capture and picking from the picture gallery too.
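If you want your own code to make this substitution obvious while testing, one option (just a sketch, the message text is an arbitrary example) is to branch on Display.isSimulator():

// Sketch: remind testers that the file chooser stands in for real mic capture
// in the simulator. Display.isSimulator() is part of the Codename One API.
String file = Capture.captureAudio();
if(file != null && Display.getInstance().isSimulator()) {
    ToastBar.showMessage("Simulator: using the chosen file instead of mic input", FontImage.MATERIAL_INFO);
}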
When a browser downloads a file, it shows download progress in the notification area or somewhere similar.
I don't think this happens by default in a Codename One app. I want to add a progress listener; how do I make it work?
if (!FileSystemStorage.getInstance().exists(filename)) {
    com.codename1.io.Util.downloadUrlToFile(PdfUrl, filename, true);
}
In my case I used the code below.
/**
 * Adaptation of Util.downloadUrlTo that reports progress on a Slider
 */
private boolean downloadUrlToAdapt(String url, final String fileName, boolean storage, final Slider slider) {
    final ConnectionRequest cr = new ConnectionRequest();
    cr.setPost(false);
    cr.setFailSilently(true);
    cr.setUrl(url);
    if (storage) {
        cr.setDestinationStorage(fileName);
    } else {
        cr.setDestinationFile(fileName);
    }
    ActionListener progressListener = new ActionListener() {
        public void actionPerformed(ActionEvent evt) {
            if (evt instanceof NetworkEvent) {
                NetworkEvent e = (NetworkEvent) evt;
                if (e.getProgressPercentage() >= 0) {
                    slider.setText(e.getProgressPercentage() + "%");
                    slider.setProgress(e.getProgressPercentage());
                }
            }
        }
    };
    NetworkManager.getInstance().addProgressListener(progressListener);
    NetworkManager.getInstance().addToQueueAndWait(cr);
    // remove the listener so it doesn't keep firing for later requests
    NetworkManager.getInstance().removeProgressListener(progressListener);
    return cr.getResponseCode() == 200;
}
I needed to show video download progress. I hope it helps.
The way the browser downloads a file locally is a special case for browsers and unrelated to apps. You can just invoke Display.execute with a file URL and the browser will download it that way, although I'm guessing that's not what you want since the file will not be accessible to you after the fact.
You can show progress using NetworkManager's progress listener. Showing progress in the notification area is Android-specific behavior and uncommon on iOS, but you might be able to use some of the local notification features: https://www.codenameone.com/blog/local-notifications.html
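As a rough illustration of the local notification route (the id, title and body strings below are placeholders; see the blog post above for the full walkthrough), firing a notification once the download completes might look something like:

// Sketch only: post a local notification after the download finishes.
LocalNotification n = new LocalNotification();
n.setId("download-complete");
n.setAlertTitle("Download");
n.setAlertBody("Your file has finished downloading");
Display.getInstance().scheduleLocalNotification(n,
        System.currentTimeMillis() + 1000,   // fire roughly a second from now
        LocalNotification.REPEAT_NONE);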
I used it the same way as Sadart Abukari. The only thing I changed is that I used ToastBar.Status to display the progress instead:
[...]
NetworkManager.getInstance().addProgressListener((evt) -> {
    if (evt instanceof NetworkEvent) {
        NetworkEvent e = (NetworkEvent) evt;
        if (e.getProgressPercentage() >= 0) {
            status.setProgress(e.getProgressPercentage());
        }
    }
});
NetworkManager.getInstance().addToQueueAndWait(cr);
// Clear the ToastBar
status.clear();
return cr.getResponseCode() == 200;
}
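For context, the status variable used above is a ToastBar.Status; creating one before queuing the request typically looks roughly like this (a sketch based on the ToastBar API, not the poster's exact code):

// Sketch: show a ToastBar status with a progress indicator before the download starts.
ToastBar.Status status = ToastBar.getInstance().createStatus();
status.setMessage("Downloading...");
status.setShowProgressIndicator(true);
status.show();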