Unable to hear any output from Text to Speech - codenameone

I am creating a Codename One application that uses the Text to Speech functionality in iOS 8. My application uses the same native iOS code as the DrSbaitso demo. I can build my application and deploy it to my iPhone successfully, but I am never able to hear any output from Text to Speech. I have verified that the native interface is being called, yet no sound plays. Is there something that needs to be implemented beyond the native interface that calls the iOS Text to Speech functionality? Is there perhaps something I need to enable on my iPhone to use the Text to Speech API? My native implementation code is listed below.
Header:
#import <Foundation/Foundation.h>
@interface com_testapp_demos_TTSImpl : NSObject {
}
-(void)say:(NSString*)param;
-(BOOL)isSupported;
@end
Source:
#import "com_testapp_demos_TTSImpl.h"
#import <AVFoundation/AVFoundation.h>
@implementation com_testapp_demos_TTSImpl
-(void)say:(NSString*)param{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    AVSpeechSynthesisVoice *voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"];
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:param];
    AVSpeechSynthesizer *syn = [[[AVSpeechSynthesizer alloc] init] autorelease];
    utterance.rate = 0;
    utterance.voice = voice;
    [syn speakUtterance:utterance];
    [pool release];
}
-(BOOL)isSupported{
    return YES;
}
@end

Verify the volume is up and that the device isn't in mute mode.
Notice that in iOS a device in mute mode may still play some sounds, so this is a common mistake!
See AVSpeechSynthesizer works on simulator but not on device
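If the silent switch turns out to be the cause, one option is to activate a playback audio session before speaking, since the playback category is not silenced by the ring/silent switch. A minimal sketch, to be called from the native say: implementation (this is an assumption to try, not a confirmed fix for the question above):
// Assumes AVFoundation is already linked, as in the code above.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryPlayback error:&error]) {
    NSLog(@"Setting audio session category failed: %@", error);
}
if (![session setActive:YES error:&error]) {
    NSLog(@"Activating audio session failed: %@", error);
}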

Related

From iOS Objective-C code and Android Java code to a Codename One PeerComponent

The page https://www.wowza.com/docs/how-to-build-a-basic-app-with-gocoder-sdk-for-ios gives the following examples:
if (self.goCoder != nil) {
    // Associate the U/I view with the SDK camera preview
    self.goCoder.cameraView = self.view;
    // Start the camera preview
    [self.goCoder.cameraPreview startPreview];
}
// Start streaming
[self.goCoder startStreaming:self];
// Stop the broadcast that is currently running
[self.goCoder endStreaming:self];
The equivalent Java code for Android, given at https://www.wowza.com/docs/how-to-build-a-basic-app-with-gocoder-sdk-for-android#start-the-camera-preview, is:
// Associate the WOWZCameraView defined in the U/I layout with the corresponding class member
goCoderCameraView = (WOWZCameraView) findViewById(R.id.camera_preview);
// Start the camera preview display
if (mPermissionsGranted && goCoderCameraView != null) {
    if (goCoderCameraView.isPreviewPaused())
        goCoderCameraView.onResume();
    else
        goCoderCameraView.startPreview();
}
// Start streaming
goCoderBroadcaster.startBroadcast(goCoderBroadcastConfig, this);
// Stop the broadcast that is currently running
goCoderBroadcaster.endBroadcast(this);
The code is self-explanatory: the first blocks start a camera preview, the second start the streaming and the third stop it. I want the preview and the streaming inside a Codename One PeerComponent, but I don't remember / understand how I have to modify both of these native code examples to return a PeerComponent to the native interface.
(I tried to read the developer guide again but I'm a bit confused on this point.)
Thank you
This is the key line in the iOS instructions:
self.goCoder.cameraView = self.view;
Here you define the view that you need to return to the peer so that we can place it. You need to change it from self.view to a view object you create; I think you can just allocate a UIView and assign/return that.
For the Android code, instead of using the XML approach they show, you can use the WOWZCameraView directly and return that, as far as I can tell.
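On the iOS side, a minimal sketch of that idea (the method name createCameraPeer and the goCoder instance variable are assumptions, and this assumes the generated Codename One stub for a Java method returning PeerComponent expects a UIView returned as a void*, under manual reference counting as in the Codename One port):
// Hypothetical native interface method: create and return our own UIView
// instead of self.view; Codename One wraps the returned view in the PeerComponent.
-(void*)createCameraPeer{
    UIView *container = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 240)];
    self.goCoder.cameraView = container;       // SDK draws its preview into our view
    [self.goCoder.cameraPreview startPreview]; // same call as in the Wowza sample
    return container;
}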

How to call a native interface function for iOS in Codename One?

I read the following guides for native interfaces.
https://www.codenameone.com/how-do-i---access-native-device-functionality-invoke-native-interfaces.html
and
https://www.codenameone.com/manual/advanced-topics.html#_native_interfaces
I did the Hello World test but can't find where the iOS call to the native interface happens in the Codename One files.
I created the .h and the .m files and ran "generate native access". After this, I can't figure out how to go on.
My intention is to call native iOS "copy from clipboard" and "paste from clipboard" functionality.
How do I call the hello world native interface function in Codename One for iOS?
What should I import?
Is there a complete sample anywhere for iOS and native interfaces?
These are the files I have now from the tutorial.
OK, the content of the .h file:
#import <Foundation/Foundation.h>
@interface com_mycompany_crtome_native_callsImpl : NSObject {
}
-(NSString*)helloWorld:(NSString*)param;
-(BOOL)isSupported;
@end
Then the .m file:
#import "com_mycompany_crtome_native_callsImpl.h"
@implementation com_mycompany_crtome_native_callsImpl
-(NSString*)helloWorld:(NSString*)param{
    NSLog(@"MyApp: %@", param);
    return @"Tada";
}
-(BOOL)isSupported{
    return YES;
}
@end
Then I have an extra Java file called native_calls.java:
package com.mycompany.crtome;
import com.codename1.system.NativeInterface;
public interface native_calls extends NativeInterface {
    String helloWorld(String hi);
}
So, how do I call that from my main Java file?
And could you explain the function and the calls step by step?
The code that binds the native interface to the iOS code is generated automatically. To use the native interface just use NativeLookup (from com.codename1.system) with your interface class, native_calls:
native_calls n = NativeLookup.lookup(native_calls.class);
if(n != null && n.isSupported()) {
    String result = n.helloWorld("Hi There");
}
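Since the question also mentions clipboard access, the same pattern should extend to it; a minimal sketch of the native side (the method names are hypothetical and would need matching declarations in the Java interface; UIPasteboard is the standard iOS clipboard API and needs UIKit imported in the .m file):
#import <UIKit/UIKit.h>
// Hypothetical additions to the Impl class.
-(void)copyToClipboard:(NSString*)text{
    [UIPasteboard generalPasteboard].string = text;
}
-(NSString*)pasteFromClipboard{
    return [UIPasteboard generalPasteboard].string;
}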

Does Everyplay support landscape in iOS 6?

I'm integrating Everyplay with my Cocos2d game. My game only supports landscape orientation.
Everything goes well on iPad.
But when I test on iPhone (iOS 6), it throws the following exception when I call "[[Everyplay sharedInstance] showEveryplay]":
reason: 'Supported orientations has no common orientation with the application, and shouldAutorotate is returning YES'
I know the orientation mechanism changed in iOS 6, so I added these methods:
-(BOOL)shouldAutorotate{
    return YES;
}
-(NSUInteger)supportedInterfaceOrientations{
    return UIInterfaceOrientationMaskLandscape;
}
-(NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window{
    return UIInterfaceOrientationMaskAllButUpsideDown;
}
Then "[[Everyplay sharedInstance] showEveryplay]" works without exception, but my game also support Portrait orientation that I do not want to.
How can I do if I only want to support Landscape in My Game, but let "[[Everyplay sharedInstance] showEveryplay]" works without exception?
You have two options to fix the problem.
Option 1:
Add a UISupportedInterfaceOrientations array to your game's Info.plist with the items UIInterfaceOrientationPortrait, UIInterfaceOrientationLandscapeLeft, UIInterfaceOrientationLandscapeRight and UIInterfaceOrientationPortraitUpsideDown. You can easily do this from Xcode by checking all the Supported Interface Orientations on your project summary page, or by editing the Info.plist file manually.
Option 2:
Add the following method to your application's AppDelegate.m file:
// iOS 6
-(NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window {
    return UIInterfaceOrientationMaskAll;
}
In both cases you must also make sure that you have added the landscape-only orientation handling code to your game's main UIViewController.
// iOS 5
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
    return UIInterfaceOrientationIsLandscape(toInterfaceOrientation);
}
// iOS 6
- (BOOL)shouldAutorotate {
    return YES;
}
- (NSUInteger)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscapeLeft | UIInterfaceOrientationMaskLandscapeRight;
}
On iPhone the Everyplay webview is always in portrait mode, but on iPad the webview supports both. Recording supports both modes, as does the video player. We will likely update the landscape mode for the iPhone resolution too in the near future, but it will require some redesign before this task is complete.

How to play a song in mp3 format from an http link in an iOS 6 app?

I have recently started working on iOS application development. These days I am working on a music player which must have the following functionality:
- Online buffering of the song from a php web service.
- Play, pause, stop, next song, previous song.
- Two sliders: one for volume control and the other for showing the play time of the song.
- Shuffle, repeat song.
I have tried these things with AVPlayer and AVAudioPlayer. With AVAudioPlayer I think it is not possible to stream data from a url (I have tried a lot), so I did it using AVPlayer, but AVPlayer does not support options like volume control, and even the buffering is not proper: I have to press the play button again if the buffering stops at some point. I urgently need help with this audio player: any tutorial or example that I can understand easily, as I am new to this field.
Here is the code
NSURL *url = [[NSURL alloc] initWithString:@"http://www.androidmobiapps.com/extrememusic/app/songs/1.mp3"];
[self setupAVPlayerForURL:url];
I am calling this in viewDidLoad:
-(void) setupAVPlayerForURL: (NSURL*) url {
    AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVPlayerItem *anItem = [AVPlayerItem playerItemWithAsset:asset];
    audioPlayer = [AVPlayer playerWithPlayerItem:anItem];
    // Observe the player's status so we know when it is ready to play
    [audioPlayer addObserver:self forKeyPath:@"status" options:0 context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == audioPlayer && [keyPath isEqualToString:@"status"]) {
        if (audioPlayer.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (audioPlayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayer Ready to Play");
        } else if (audioPlayer.status == AVPlayerStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
I've implemented the functions you mentioned above as a class and open-sourced it. You can use the class directly or treat it as an AVFoundation beginner example. Good luck!
Watch this repo on GitHub here
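On the volume-control point specifically: AVPlayer only gained its own volume property in iOS 7, so on iOS 6 one option is an audio mix on the player item. A minimal sketch reusing the question's asset and anItem variables (note an audio mix may not take effect on every streamed asset, so treat this as an approach to verify):
// Build per-track volume parameters and attach them to the player item.
NSMutableArray *params = [NSMutableArray array];
for (AVAssetTrack *track in [asset tracksWithMediaType:AVMediaTypeAudio]) {
    AVMutableAudioMixInputParameters *p =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [p setVolume:0.5 atTime:kCMTimeZero]; // 0.0 = silent, 1.0 = full; wire a slider value here
    [params addObject:p];
}
AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
mix.inputParameters = params;
anItem.audioMix = mix;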

The function `CGCMSUtilsGetICCProfileDataWithRenderingIntent' is obsolete. Why does this solution work?

I am maintaining some code written by someone else, and when I build with Xcode 4.5 and run on iOS 6 I get this runtime "error"
<Error>: The function `CGCMSUtilsGetICCProfileDataWithRenderingIntent' is obsolete and will be removed in an upcoming update. Unfortunately, this application, or a library it uses, is using this obsolete function, and is thereby contributing to an overall degradation of system performance. Please use `CGColorSpaceCopyICCProfile' instead.
when executing this code:
CGColorSpaceRef alternate = CGColorSpaceCreateDeviceRGB();
NSString *iccProfilePath = [[NSBundle mainBundle] pathForResource:@"sRGB Profile" ofType:@"icc"];
NSData *iccProfileData = [NSData dataWithContentsOfFile:iccProfilePath];
CGDataProviderRef iccProfile = CGDataProviderCreateWithCFData((CFDataRef)iccProfileData);
const CGFloat range[] = {0,1,0,1,0,1}; // min/max of the three components
CGColorSpaceRef colorSpace = CGColorSpaceCreateICCBased(3, range, iccProfile, alternate);
CGContextRef context = CGBitmapContextCreate(NULL, pageWidth, pageHeight, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);
CGDataProviderRelease(iccProfile);
CGColorSpaceRelease(colorSpace);
CGColorSpaceRelease(alternate);
When I run on iOS 5.1 there is no error.
I have found that by making the following changes the error does not appear:
Change:
NSString *iccProfilePath = [[NSBundle mainBundle] pathForResource:@"sRGB Profile" ofType:@"icc"];
NSData *iccProfileData = [NSData dataWithContentsOfFile:iccProfilePath];
CGDataProviderRef iccProfile = CGDataProviderCreateWithCFData((CFDataRef)iccProfileData);
to:
char fname[]= "sRGB Profile.icc";
CGDataProviderRef iccProfile = CGDataProviderCreateWithFilename(fname);
I can't find any reference to CGDataProviderCreateWithCFData being deprecated. Can anyone explain the cause of the problem? It seems as though CGDataProviderCreateWithCFData uses CGCMSUtilsGetICCProfileDataWithRenderingIntent while CGDataProviderCreateWithFilename uses CGColorSpaceCopyICCProfile, which suggests to me that CGDataProviderCreateWithCFData is deprecated. I'm not comfortable with the solution I have found because I don't understand it. Also, I hope the solution helps someone.
So you are attaching the sRGB color profile file to the app resources and then explicitly creating an sRGB color profile at runtime on iOS. Is that needed? This document seems to suggest that device RGB is actually the sRGB colorspace:
Apple WWDC color management talk
It would be nice if we could just call:
colorspace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
But this does not seem to be supported on iOS either.
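If device RGB really is sRGB on iOS, as that talk suggests, the ICC profile plumbing could in principle be dropped entirely. A minimal sketch of that simplification, reusing the question's variables (untested against the original warning, so treat it as an assumption to verify):
// Create the bitmap context directly in device RGB, skipping the bundled
// ICC profile and the ICC-based colorspace entirely.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, pageWidth, pageHeight,
                                             bitsPerComponent, bytesPerRow,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);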
