I am using the following code to play a recording. In iOS 6 the output is automatically directed to the speaker / headphones. However in iOS 7 the audio is directed to the earpiece. Has AVAudioPlayer behaviour changed in iOS 7?
if (!recorder.recording) {
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:nil];
    [player setDelegate:(id)self];
    [player prepareToPlay];
    [player play];
}
iOS 7 just changed the defaults: with the play-and-record category, audio is now routed to the receiver (earpiece) by default. To override the default route, insert this snippet anywhere in your code; it doesn't matter whether it runs before or after you instantiate the AVAudioPlayer.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
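Note that AudioSessionSetProperty and the other C-based Audio Session Services calls are deprecated as of iOS 7. A sketch of the equivalent using only the AVAudioSession API (available since iOS 6) would be:

```objc
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
// Route output to the built-in speaker instead of the receiver.
[session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
```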
I am creating my own Codename One application that uses the text-to-speech functionality in iOS 8. My application uses the same native iOS code as given in the DrSbaitso demo. I can build my application and deploy it to my iPhone successfully; however, I am never able to hear any output from the text-to-speech. I have verified that the native interface is getting called, but I cannot hear any sound. Is there something else that needs to be implemented beyond the native interface that calls the iOS text-to-speech functionality? Is there perhaps something I need to enable on my iPhone to use the text-to-speech API? I have listed the native implementation code that I am using below.
Header:
#import <Foundation/Foundation.h>
@interface com_testapp_demos_TTSImpl : NSObject {
}
- (void)say:(NSString *)param;
- (BOOL)isSupported;
@end
Source:
#import "com_testapp_demos_TTSImpl.h"
#import <AVFoundation/AVFoundation.h>
@implementation com_testapp_demos_TTSImpl

- (void)say:(NSString *)param {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    AVSpeechSynthesisVoice *voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"];
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:param];
    AVSpeechSynthesizer *syn = [[[AVSpeechSynthesizer alloc] init] autorelease];
    utterance.rate = 0;
    utterance.voice = voice;
    [syn speakUtterance:utterance];
    [pool release];
}

- (BOOL)isSupported {
    return YES;
}

@end
Verify the volume is up and that the device isn't in silent mode.
Notice that on iOS a device may be in silent mode and still play some sounds, so this is a common mistake!
See AVSpeechSynthesizer works on simulator but not on device
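If the silent switch turns out to be the culprit, one common workaround (a sketch, assuming AVFoundation is already linked) is to activate an audio session with a playback category before speaking; that category is not silenced by the ring/silent switch:

```objc
#import <AVFoundation/AVFoundation.h>

// AVAudioSessionCategoryPlayback ignores the ring/silent switch,
// so synthesized speech remains audible in silent mode.
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
```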
I'm integrating Everyplay with my Cocos2d game. My game only supports landscape orientation.
Everything works well on iPad.
But when I test on iPhone (iOS 6), it throws the following exception when I call "[[Everyplay sharedInstance] showEveryplay]":
reason: 'Supported orientations has no common orientation with the application, and shouldAutorotate is returning YES'
I know the orientation mechanism changed in iOS 6, so I added these methods:
- (BOOL)shouldAutorotate {
    return YES;
}

- (NSUInteger)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscape;
}

- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window {
    return UIInterfaceOrientationMaskAllButUpsideDown;
}
Then "[[Everyplay sharedInstance] showEveryplay]" works without the exception, but my game now also supports portrait orientation, which I do not want.
How can I support only landscape in my game while still letting "[[Everyplay sharedInstance] showEveryplay]" work without the exception?
You have two options to fix the problem.
Option 1:
Add a UISupportedInterfaceOrientations array to your game's Info.plist with the items UIInterfaceOrientationPortrait, UIInterfaceOrientationLandscapeLeft, UIInterfaceOrientationLandscapeRight and UIInterfaceOrientationPortraitUpsideDown. You can easily do this from Xcode by checking all Supported Interface Orientations on your project's summary page, or by editing the Info.plist file manually.
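Edited manually, the Info.plist entry for the four orientations described above would look something like this:

```xml
<key>UISupportedInterfaceOrientations</key>
<array>
    <string>UIInterfaceOrientationPortrait</string>
    <string>UIInterfaceOrientationLandscapeLeft</string>
    <string>UIInterfaceOrientationLandscapeRight</string>
    <string>UIInterfaceOrientationPortraitUpsideDown</string>
</array>
```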
Option 2:
Add the following method to your application's AppDelegate.m file:
// iOS 6
- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window {
    return UIInterfaceOrientationMaskAll;
}
In both cases you must also make sure that you have added the landscape-only orientation handling code to your game's main UIViewController:
// iOS 5
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
    return UIInterfaceOrientationIsLandscape(toInterfaceOrientation);
}

// iOS 6
- (BOOL)shouldAutorotate {
    return YES;
}

- (NSUInteger)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscapeLeft | UIInterfaceOrientationMaskLandscapeRight;
}
On iPhone the Everyplay webview is always in portrait mode, but on iPad the webview supports both orientations. Recording supports both modes, as does the video player. We will likely update the landscape mode for iPhone resolutions in the near future, but it will require some redesign before this task is complete.
I have an app on the App Store which plays a selection of videos. Currently all of the videos are in the .mov file format, but this makes the size of the app rather large, so I'm trying to use a different file format to reduce the overall size of the app.
I am trying to use the MP4 format, as this reduces the size of each video by more than half, but when I do, the app crashes when I try to play the video, with the following error message:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSURL initFileURLWithPath:]: nil string parameter'
I have used the following code for each video in my implementation file and changed the file name and type to match the new video, so I don't understand why there should be a problem with the file path.
- (IBAction)playDaresWins:(id)sender {
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
        pathForResource:@"DaresWins" ofType:@"mov"]];

    _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(moviePlayBackDidFinish:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:_moviePlayer];

    _moviePlayer.controlStyle = MPMovieControlStyleNone;
    _moviePlayer.shouldAutoplay = YES;
    [self.view addSubview:_moviePlayer.view];
    [_moviePlayer setFullscreen:YES animated:NO];
}
Am I missing something?
.mov isn't a video format or codec; it's a container. The developer documentation provides a list of supported video codecs, bit rates, and resolutions (linked here; I won't reproduce them, as they can change from OS version to OS version).
However, I don't think that's the problem, because it looks as if you're getting the exception when you create the NSURL, not when you play the video. That suggests that the path you're providing for your video doesn't exist. Are you sure you have (a) the right filename, (b) the right extension (perhaps it's mp4 instead of mov), and (c) added the movie file to your project correctly?
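One way to make the failure obvious (a sketch; the resource name and type are placeholders for whatever file you actually ship) is to check the result of pathForResource:ofType: before handing it to NSURL:

```objc
// pathForResource:ofType: returns nil when the file isn't in the bundle,
// and -[NSURL fileURLWithPath:] throws on a nil path.
NSString *path = [[NSBundle mainBundle] pathForResource:@"DaresWins" ofType:@"mp4"];
if (path == nil) {
    NSLog(@"Movie file missing from bundle - check the name, extension, and target membership");
    return;
}
NSURL *url = [NSURL fileURLWithPath:path];
```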
I want to achieve functionality in which I use GPS for location updates, and when the user is not moving I want the GPS updates to be paused.
As per the "Staying on Track with Location Services" video, I did this:
// Configure the location manager
self.theLocationManager = [[CLLocationManager alloc] init];
[self.theLocationManager setDelegate:self];
[self.theLocationManager setDesiredAccuracy:kCLLocationAccuracyBest];
[self.theLocationManager setActivityType:CLActivityTypeFitness];
NSLog(@"Activity Type Set: CLActivityTypeFitness");
[self.theLocationManager setPausesLocationUpdatesAutomatically:YES];
[self.theLocationManager startUpdatingLocation];
Delegate methods:
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations {
    NSLog(@"Started location Updates");
}

- (void)locationManagerDidPauseLocationUpdates:(CLLocationManager *)manager {
    NSLog(@"Pausing location Updates");
}

- (void)locationManagerDidResumeLocationUpdates:(CLLocationManager *)manager {
    NSLog(@"Resuming location Updates");
    UIAlertView *pop = [[UIAlertView alloc] initWithTitle:@"Info"
                                                  message:@"Location Updates resumed"
                                                 delegate:nil
                                        cancelButtonTitle:@"OK"
                                        otherButtonTitles:nil];
    [pop show];
    [pop release];
}
I did not get the desired behavior; the "didPause..." and "didResume..." delegate methods were not even called when the device was idle.
Ideally the GPS updates should stop and resume depending on the state of the device and the CLActivityType, but that is not the case here.
Can anyone tell me what I am doing wrong?
Thanks in advance.
The pause should occur if you leave the device at the same location for about 15 minutes.
You should probably start region monitoring when the locationManagerDidPauseLocationUpdates: delegate method is called. Please see the update to the first answer in this post:
iOS 6 AutoPause doesn't work
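A sketch of that idea (assuming iOS 7's CLCircularRegion and a last-known location you have saved yourself in didUpdateLocations:; the radius and identifier are illustrative):

```objc
- (void)locationManagerDidPauseLocationUpdates:(CLLocationManager *)manager {
    CLLocation *last = self.lastKnownLocation; // hypothetical property, saved in didUpdateLocations:
    if (last == nil) {
        return;
    }
    // Monitor a small region around the resting spot; when the user
    // leaves it, restart continuous updates.
    CLCircularRegion *region = [[CLCircularRegion alloc] initWithCenter:last.coordinate
                                                                 radius:100.0
                                                             identifier:@"pausedRegion"];
    [manager startMonitoringForRegion:region];
}

- (void)locationManager:(CLLocationManager *)manager didExitRegion:(CLRegion *)region {
    [manager stopMonitoringForRegion:region];
    [manager startUpdatingLocation];
}
```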
I am maintaining some code written by someone else, and when I build with Xcode 4.5 and run on iOS 6 I get this runtime "error":
<Error>: The function `CGCMSUtilsGetICCProfileDataWithRenderingIntent' is obsolete and will be removed in an upcoming update. Unfortunately, this application, or a library it uses, is using this obsolete function, and is thereby contributing to an overall degradation of system performance. Please use `CGColorSpaceCopyICCProfile' instead.
when executing this code:
CGColorSpaceRef alternate = CGColorSpaceCreateDeviceRGB();
NSString *iccProfilePath = [[NSBundle mainBundle] pathForResource:@"sRGB Profile" ofType:@"icc"];
NSData *iccProfileData = [NSData dataWithContentsOfFile:iccProfilePath];
CGDataProviderRef iccProfile = CGDataProviderCreateWithCFData((CFDataRef)iccProfileData);
const CGFloat range[] = {0, 1, 0, 1, 0, 1}; // min/max of the three components
CGColorSpaceRef colorSpace = CGColorSpaceCreateICCBased(3, range, iccProfile, alternate);
CGContextRef context = CGBitmapContextCreate(NULL, pageWidth, pageHeight, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);
CGDataProviderRelease(iccProfile);
CGColorSpaceRelease(colorSpace);
CGColorSpaceRelease(alternate);
When I run on iOS 5.1 there is no error.
I have found that by making the following changes the error does not appear:
Change:
NSString *iccProfilePath = [[NSBundle mainBundle] pathForResource:@"sRGB Profile" ofType:@"icc"];
NSData *iccProfileData = [NSData dataWithContentsOfFile:iccProfilePath];
CGDataProviderRef iccProfile = CGDataProviderCreateWithCFData((CFDataRef)iccProfileData);
to:
char fname[]= "sRGB Profile.icc";
CGDataProviderRef iccProfile = CGDataProviderCreateWithFilename(fname);
I can't find any reference to CGDataProviderCreateWithCFData being deprecated. Can anyone explain the cause of the problem? It appears that CGDataProviderCreateWithCFData uses CGCMSUtilsGetICCProfileDataWithRenderingIntent, while CGDataProviderCreateWithFilename uses CGColorSpaceCopyICCProfile, which suggests to me that CGDataProviderCreateWithCFData is deprecated. I'm not comfortable with the workaround I have found because I don't understand why it works. Still, I hope it helps someone.
So, you are attaching the sRGB color profile file to the app resources and then explicitly creating an sRGB color space at runtime on iOS. Is that needed? This document seems to suggest that the device RGB color space is actually sRGB:
Apple WWDC color management talk
It would be nice if we could just call:
colorspace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
But this does not seem to be supported on iOS either.