Faces detected on simulator but not on iPhone using CoreImage framework

I'm using CoreImage to detect faces in pictures. It works great on the simulator, but on my iPhone 5 it almost never works with pictures taken with the iPhone's camera (it does work with pictures picked from the web).
The following code shows how I detect the faces. For every picture, the application logs
step 1 : image will be processed
But it only logs
step 2 : face detected
for a few of them, whereas almost every face is detected on the simulator or when I use pictures from the web.
var context: CIContext = {
    return CIContext(options: nil)
}()

let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: context,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

let imageView = mainPic

for var index = 0; index < picsArray.count; index++ {
    if !(picsArray.objectAtIndex(index).objectAtIndex(1) as! Bool) {
        var wholeImageData: AnyObject = picsArray.objectAtIndex(index)[0]
        if wholeImageData.isKindOfClass(NSData) {
            let wholeImage: UIImage = UIImage(data: wholeImageData as! NSData)!
            if wholeImage.isKindOfClass(UIImage) {
                NSLog("step 1 : image will be processed")
                let processedImage = wholeImage
                let inputImage = CIImage(image: processedImage)
                var faceFeatures: [CIFaceFeature]!
                if let orientation: AnyObject = inputImage.properties()?[kCGImagePropertyOrientation] {
                    faceFeatures = detector.featuresInImage(inputImage, options: [CIDetectorImageOrientation: orientation]) as! [CIFaceFeature]
                } else {
                    faceFeatures = detector.featuresInImage(inputImage) as! [CIFaceFeature]
                }
                let inputImageSize = inputImage.extent().size
                var transform = CGAffineTransformIdentity
                transform = CGAffineTransformScale(transform, 1, -1)
                transform = CGAffineTransformTranslate(transform, 0, -inputImageSize.height)
                for faceFeature in faceFeatures {
                    NSLog("step 2 : face detected")
                    // ...
I've been looking for a solution for three hours now, and I'm quite desperate :).
Any suggestion would be really appreciated!
Thanks in advance.

I found a really weird way to solve my problem.
By setting the allowsEditing property of my UIImagePickerController to true when picking the pictures, everything works fine... I can't understand why, but it works.
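One possible explanation (just a guess on my part) is that the edited image comes back re-rendered upright, so the detector no longer depends on the photo's orientation metadata. For reference, a minimal sketch of the picker setup described above, written in current Swift syntax (the presenting view controller and the delegate conformance are assumed):

// Sketch: picker configured with editing enabled, which in my case made
// CIDetector find faces in camera photos.
let picker = UIImagePickerController()
picker.sourceType = .camera        // or .photoLibrary
picker.allowsEditing = true        // the setting that fixed detection for me
picker.delegate = self             // UIImagePickerControllerDelegate & UINavigationControllerDelegate
present(picker, animated: true)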


How do I fill an array of data with CoreData data?

I am learning to use SwiftUI with Core Data.
I am trying to fill a Line Chart with saved weight data like below:
LineView(data: [0, 32, 445, 56, 99])
I've gotten as far as this, but I'm getting an error on the "var locations = ..." line saying "Type of expression is ambiguous without more context":
var fetchRequest = NSFetchRequest<NSFetchRequestResult>(entityName: "UserWeight")
var locations = mocW.executeFetchRequest(fetchRequest, error: nil) as [UserWeight]
for weight in weights {
    print(weights.userWeight)
}
Any help on this, and on how I would populate the line chart with this data, would be greatly appreciated!
For SwiftUI, I suspect that you are attempting to achieve the following...
struct YourView: View {
    @FetchRequest(entity: UserWeight.entity(),
                  sortDescriptors: []
    ) var weights: FetchedResults<UserWeight>

    var body: some View {
        ForEach(weights) { weight in
            Text(weight.userWeight)
        }
    }
}
Core Data entities conform to the Identifiable protocol, so you're able to drop the id: parameter in the ForEach structure...
ForEach(weights) { weight in
Otherwise you'd need to use...
ForEach(weights, id: \.self) { weight in
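To feed the LineView from your question, a rough sketch of the mapping (this assumes userWeight is stored as a Double attribute and that LineView takes a [Double]; adjust the map if it's a String):

// Inside the same view: map the fetched objects into the array the chart expects.
var chartData: [Double] {
    weights.map { $0.userWeight }
}

var body: some View {
    LineView(data: chartData)
}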
Note: As an aside, it would help us if you could provide more detail in your questions in the future. The more information you provide, the easier it is for the community to understand your issue and provide a suitable response. Remember that your question and our answers may not only help you, but also help others in the future as they visit the site looking for answers to their own problems.
How do I ask a good question?
if let appDelegate = UIApplication.shared.delegate as? AppDelegate {
    let managedObjectContext = appDelegate.persistentContainer.viewContext

    let fetchRequest = NSFetchRequest<Memory>(entityName: "Memory")
    let sortDescriptor = NSSortDescriptor(key: "rating", ascending: false)
    let predicate = NSPredicate(format: "mediaType == %@", "image")

    fetchRequest.predicate = predicate
    fetchRequest.sortDescriptors = [sortDescriptor]

    do {
        // "result" is a [Memory] property declared elsewhere
        result = try managedObjectContext.fetch(fetchRequest)
    } catch {
        // handle the fetch error here
    }
}
"result" is an array of, in my case, Memory objects which are instances of NSManagedObject. To access properties and populate views I do this:
for memory in result {
    let value = memory.entityPropertyName
}
I think this should be enough to get you started; let me know if you have more questions.
If UserWeight is a subclass of NSManagedObject, you should declare your fetch request as
var fetchRequest = NSFetchRequest<UserWeight>(entityName: "UserWeight")
Or else as
let fetchRequest: NSFetchRequest<UserWeight> = UserWeight.fetchRequest()
Then you can use the fetch like this, and the type of locations will be Array<UserWeight>.
let locations = try context.fetch(fetchRequest)
I'm not sure where executeFetchRequest(fetchRequest, error: nil) comes from-- it's not a function defined by NSManagedObjectContext in Swift. It resembles the Objective-C version of the function, but in Swift it's different.
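Putting it together, a minimal sketch (assuming a managed object context named context is in scope, and that userWeight is an attribute on the entity):

let fetchRequest: NSFetchRequest<UserWeight> = UserWeight.fetchRequest()

do {
    // The typed request makes `locations` an Array<UserWeight> with no casting.
    let locations = try context.fetch(fetchRequest)
    for weight in locations {
        print(weight.userWeight)
    }
} catch {
    print("Fetch failed: \(error)")
}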

How to loop an AVAudioPlayerNode in Xcode 10

Hi, I am making an iOS game in SpriteKit and decided to use an AVAudioPlayerNode and engine instead of the regular AVAudioPlayer so I could change the pitch and playback speed, but I have come across a problem: the AVAudioPlayerNode doesn't repeat after being played.
I have already tried many answers from Stack Overflow, but they are very outdated. If you do happen to solve this, can you make sure it works with my code? Otherwise it's useless.
let engine = AVAudioEngine()
let speedControl = AVAudioUnitVarispeed()
let pitchControl = AVAudioUnitTimePitch()
let audioplayer = AVAudioPlayerNode()

let audioPath = Bundle.main.path(forResource: "loop", ofType: "wav")

func play(_ url: URL) throws {
    let file = try AVAudioFile(forReading: url)

    engine.attach(audioplayer)
    engine.attach(pitchControl)
    engine.attach(speedControl)

    engine.connect(audioplayer, to: speedControl, format: nil)
    engine.connect(speedControl, to: pitchControl, format: nil)
    engine.connect(pitchControl, to: engine.mainMixerNode, format: nil)

    audioplayer.scheduleFile(file, at: nil)

    try engine.start()
    audioplayer.play()
}
This simply gets run in the game view controller by doing this:
let audioPath = Bundle.main.path(forResource: "loop", ofType: "wav") ?? ""
do {
    try? play(URL(fileURLWithPath: audioPath))
} catch {
}
I actually have no idea where to start. I am fairly new to Xcode, so any help would be useful. Please don't overcomplicate things, I am 15.
You want to read your file into a buffer, then schedule the player to loop it:
let file = try AVAudioFile(forReading: url)
// The buffer format must match the file's processingFormat for read(into:) to work.
let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                       frameCapacity: AVAudioFrameCount(file.length))!
try file.read(into: audioFileBuffer)
// The .loops option makes the scheduled buffer repeat indefinitely.
audioplayer.scheduleBuffer(audioFileBuffer, at: nil, options: .loops, completionHandler: nil)
audioplayer.play()
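Not part of the original answer, but as a sketch of how this might slot into the play(_:) function from your question (same engine, nodes and connections as above, with scheduleBuffer replacing scheduleFile):

func play(_ url: URL) throws {
    let file = try AVAudioFile(forReading: url)

    engine.attach(audioplayer)
    engine.attach(pitchControl)
    engine.attach(speedControl)

    engine.connect(audioplayer, to: speedControl, format: nil)
    engine.connect(speedControl, to: pitchControl, format: nil)
    engine.connect(pitchControl, to: engine.mainMixerNode, format: nil)

    // Read the file into a buffer and schedule it to loop,
    // instead of scheduling the file once.
    let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                  frameCapacity: AVAudioFrameCount(file.length))!
    try file.read(into: buffer)
    audioplayer.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)

    try engine.start()
    audioplayer.play()
}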

How do I fix a random amount of for-loop?

I am creating a game (space shooter) using SpriteKit and am currently adding animations to the game. When explosions spawn, the animation loops an X number of times (it's really random). Sometimes it will loop 3 times and sometimes up to 10 times. The screen ends up being filled with meaningless explosion animations.
I used to have a simple fade in/fade out animation which was working fine, but have finally upgraded to something smoother. I introduced a for loop and it has given me this issue. I also tried using a while loop, to no avail. I have tried using the animation without a sequence, but that doesn't fix anything either.
func spawnExplosion(spawnPosition: CGPoint) {
    var explosion = SKSpriteNode()
    textureAtlas = SKTextureAtlas(named: "explosion")

    for i in 1...textureAtlas.textureNames.count {
        let name = "explosion\(i).png"
        textureArray.append(SKTexture(imageNamed: name))
        print(i)
    }

    explosion = SKSpriteNode(imageNamed: "explosion1.png")
    explosion.setScale(0.6)
    explosion.position = spawnPosition
    explosion.zPosition = 3
    self.addChild(explosion)

    print(textureAtlas.textureNames.count)

    // explosion animation-action
    let explosionAnimation = SKAction.repeat(SKAction.animate(with: textureArray, timePerFrame: 0.05), count: 1)
    let delete = SKAction.removeFromParent()
    let explosionSequence = SKAction.sequence([explosionSound, explosionAnimation, delete])

    explosion.run(explosionSequence)
}
The expected result is that, when the function is called, the animation should run through ONCE and delete itself. Instead, it runs anywhere from once to 10 or so times.
Thanks to @Knight0fDragon, I was able to fix this issue by making the texture array local within the function. Now each explosion has its own instance.
func spawnExplosion(spawnPosition: CGPoint) {
    var explosion = SKSpriteNode()
    var textureAtlas = SKTextureAtlas()
    var textureArray = [SKTexture]()

    textureAtlas = SKTextureAtlas(named: "explosion")

    for i in 1...textureAtlas.textureNames.count {
        let name = "explosion\(i).png"
        textureArray.append(SKTexture(imageNamed: name))
    }

    explosion = SKSpriteNode(imageNamed: "explosion1.png")
    explosion.setScale(0.6)
    explosion.position = spawnPosition
    explosion.zPosition = 3
    self.addChild(explosion)

    // explosion animation-action
    let explosionAnimation = SKAction.repeat(SKAction.animate(with: textureArray, timePerFrame: 0.05), count: 1)
    let delete = SKAction.removeFromParent()
    let explosionSequence = SKAction.sequence([explosionSound, explosionAnimation, delete])

    explosion.run(explosionSequence)
}

Invalid output filetype

I am trying to output a wav file from the function below; however, at runtime I get the error "Invalid output filetype". I am confused as to why AVFileType.wav doesn't work; I tested AVFileType.m4a and it works for some reason. Cheers!
func createSound(soundFiles: [String], outputFile: String) {
    var startTime: CMTime = kCMTimeZero
    let composition: AVMutableComposition = AVMutableComposition()
    let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)!

    for fileName in soundFiles {
        let sound: String = fileName
        let url: URL = URL(fileURLWithPath: sound)
        let avAsset: AVURLAsset = AVURLAsset(url: url)
        let timeRange: CMTimeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
        let audioTrack: AVAssetTrack = avAsset.tracks(withMediaType: AVMediaType.audio)[0]

        try! compositionAudioTrack.insertTimeRange(timeRange, of: audioTrack, at: startTime)
        startTime = CMTimeAdd(startTime, timeRange.duration)
    }

    let exportPath: String = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].path + "/" + outputFile + ".wav"
    let export: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)!

    export.outputURL = URL(fileURLWithPath: exportPath)
    export.outputFileType = AVFileType.wav

    export.exportAsynchronously {
        if export.status == AVAssetExportSessionStatus.completed {
            NSLog("All done")
            print(export.outputURL)
        }
    }
}
The preset AVAssetExportPresetAppleM4A is for creating m4a files, not wav files. For wav, try the passthrough preset, AVAssetExportPresetPassthrough, and delete any files left over from previous runs:
let export: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)!

export.outputURL = URL(fileURLWithPath: exportPath)
export.outputFileType = .wav

try? FileManager.default.removeItem(at: export.outputURL!) // otherwise export can fail :(

export.exportAsynchronously {
    // etc
}
NB: with AVAssetExportPresetPassthrough, your input audio files will probably need to be wavs too, and maybe even have the same format. If that's no good for you, then try one of the AVAssetWriter, AVAudioFile or ExtAudioFile APIs.
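For example, a rough sketch of the AVAudioFile route (my own illustration, not part of the original answer; it assumes the whole file fits in memory, that a 16-bit PCM wav output is acceptable, and inputURL/outputURL are placeholder names):

import AVFoundation

// Re-encode an audio file as 16-bit linear PCM in a .wav container.
func convertToWav(inputURL: URL, outputURL: URL) throws {
    let inputFile = try AVAudioFile(forReading: inputURL)

    // Output settings for a WAV (linear PCM) file, keeping the source's
    // sample rate and channel count.
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: inputFile.processingFormat.sampleRate,
        AVNumberOfChannelsKey: inputFile.processingFormat.channelCount,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false
    ]
    let outputFile = try AVAudioFile(forWriting: outputURL, settings: settings)

    // Read everything into one buffer and write it out again
    // (fine for short clips; long files would need chunked reads).
    let buffer = AVAudioPCMBuffer(pcmFormat: inputFile.processingFormat,
                                  frameCapacity: AVAudioFrameCount(inputFile.length))!
    try inputFile.read(into: buffer)
    try outputFile.write(from: buffer)
}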
P.S. If you're curious about which AVAssetExportSession preset, AVAsset/AVMutableComposition and AVFileType combinations are supported, you can use the determineCompatibility function:
AVAssetExportSession.determineCompatibility(ofExportPreset: AVAssetExportPresetPassthrough, with: composition, outputFileType: .wav) { ok in
    print("COMPUTER SAYS \(ok)")
}
although it's basically a dry-run export, minus the file-URL-already-exists check and with the occasionally useful Error turned into an unhelpful boolean, so I guess it doesn't add much value at all.

Swift array in dictionary leads to NSCFArray

I prepare a Swift array in my Watch interface and send it to the iOS app:
@IBAction func buttonGeklickt() {
    if WCSession.isSupported() {
        let session = WCSession.defaultSession()
        session.delegate = self
        session.activateSession()

        let dateFormatter = NSDateFormatter()
        dateFormatter.dateFormat = "hh:mm"
        let datumString = dateFormatter.stringFromDate(NSDate())

        var swiftArray = [String]()
        swiftArray.append(datumString)

        var swiftDict = ["a": swiftArray]
        session.transferUserInfo(swiftDict)
    }
}
So far so good: on the iOS app the dictionary arrives, but there seems to be something wrong with the array in the dictionary:
func session(session: WCSession, didReceiveUserInfo userInfo: [String : AnyObject]) {
    print("seems to be the same Dict = \(userInfo)")
    if let vw = userInfo["a"] as? [String: String] {
        print("Never called! Here I would expect my array from the watch \(vw)")
    }
}
I would expect and like vw to hold the same array as swiftArray in the watch app. However, it seems to be of type __NSCFArray:
screenshot
So what am I doing wrong here?
I'm new to Swift; however, I'm experienced enough with Objective-C to have solved practically every problem I've faced in past years. This issue seems so basic that it's embarrassing I'm not able to solve it on my own, so help is much appreciated.
If I understand your code correctly, you are saving "a" as a value of type [String], but you are trying to read it as [String: String]. Instead of
if let vw = userInfo["a"] as? [String: String]
try
if let vw = userInfo["a"] as? [String]
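In context, the receiving delegate method would look something like this (same Swift 2 era API as in your question):

func session(session: WCSession, didReceiveUserInfo userInfo: [String : AnyObject]) {
    // The value stored under "a" was sent as a [String], so cast it as such.
    if let vw = userInfo["a"] as? [String] {
        print("Array from the watch: \(vw)")   // e.g. ["10:42"]
    }
}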
