ImageView inside of ScrollView showing wrong picture - arrays

I have a scroll view with an image view embedded in it. The user taps the "Take Photo" button, takes a picture, and those photos are stored in an array and then displayed in the scrollable image view. My issue is that after didFinishPickingMediaWithInfo is called and the camera closes, the image view always shows the first image in the array. I want the image view to show the most recent image added to the array and let the user scroll backward through the images until they reach the first image in the array. How do I make this happen?
By the way, I don't know if this is even possible, but I tried reversing the for loop and that didn't work.
I'm a new programmer, so please explain your answers.
Here's my code:
import UIKit

class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate {

    @IBOutlet weak var scrollImageView: UIScrollView!

    var imageTaken: [UIImage] = []
    var imagePicker = UIImagePickerController()

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollImageView.frame = scrollImageView.frame
    }

    @IBAction func takePhotoButton(_ sender: Any) {
        imagePicker = UIImagePickerController()
        imagePicker.delegate = self
        imagePicker.sourceType = .camera
        present(imagePicker, animated: true, completion: nil)
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        imageTaken.append((info[.originalImage] as? UIImage)!)
        imagePicker.dismiss(animated: true, completion: nil)

        for i in 0..<imageTaken.count {
            let imageView = UIImageView()
            imageView.image = imageTaken[i]
            imageView.contentMode = .scaleAspectFit
            let xPosition = self.scrollImageView.frame.width * CGFloat(i)
            imageView.frame = CGRect(x: xPosition, y: 0, width: self.scrollImageView.frame.width, height: self.scrollImageView.frame.height)
            scrollImageView.contentSize.width = scrollImageView.frame.width * CGFloat(i + 1)
            scrollImageView.addSubview(imageView)
        }
    }
}

Here are two options; take whichever one seems easier.
OOP: create an array of objects, for example:
struct TakePhotos {
    var image: UIImage
    var timestamp: Date
}
Then sort the array by the timestamp.
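For example, a minimal sketch of that approach (photosTaken and add(_:) are hypothetical names; the append would happen in didFinishPickingMediaWithInfo):

// Keep the capture time alongside each photo, then sort newest-first.
var photosTaken: [TakePhotos] = []

func add(_ image: UIImage) {
    photosTaken.append(TakePhotos(image: image, timestamp: Date()))
    photosTaken.sort { $0.timestamp > $1.timestamp } // most recent photo ends up at index 0
}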
Use PHAsset
import Photos
Using PHAsset you should be able to retrieve the images taken from the user's album. Keep a count of when the user takes a new image.
Then you can run a query and sort the images in reverse order; they will already have a timestamp.
let photoOptions = PHFetchOptions()
photoOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
This is what I did in my application.
// FIXME: - Query only happens once; this needs to be refactored so it happens when the album is updated on the device.
// Query only happens one time.
if imageArray.isEmpty {
    let photoOptions = PHFetchOptions()
    photoOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let allPhotos = PHAsset.fetchAssets(with: photoOptions)

    let fetchOptions = PHImageRequestOptions()
    fetchOptions.deliveryMode = .highQualityFormat
    fetchOptions.resizeMode = .exact
    fetchOptions.normalizedCropRect = CGRect(x: 0, y: 0, width: view.frame.width, height: view.frame.width)
    // Must be a synchronous call, otherwise the view loads before images are retrieved (result is an empty view).
    fetchOptions.isSynchronous = true

    let imageSize: CGSize = CGSize(width: view.frame.width, height: view.frame.height)

    // Retrieves images using the assets.
    allPhotos.enumerateObjects { (assets, index, pointer) in
        DispatchQueue.global(qos: .background).sync {
            PHImageManager.default().requestImage(for: assets, targetSize: imageSize, contentMode: .aspectFit, options: fetchOptions) { (image, hashDictionary) in
                guard let image = image else { return }
                self.imageArray.append(image)
                self.albumImages.append(ImageAlbum(images: image))
            }
        }
    }

    selectedImage.image = imageArray.first
    selectedImage.contentMode = .scaleAspectFill
}
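As a smaller alternative (a sketch, not part of the two approaches above): if the array is kept in capture order, the layout loop in the question can simply be followed by a jump to the last page, so the most recent photo is the one shown when the camera closes.

// After the for loop in didFinishPickingMediaWithInfo:
// scroll to the page holding the last (newest) image; the user can swipe back through older ones.
let lastPageX = scrollImageView.frame.width * CGFloat(imageTaken.count - 1)
scrollImageView.setContentOffset(CGPoint(x: lastPageX, y: 0), animated: false)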

Related

UIImage isn't showing up in UIScrollView

I'm grabbing the photos from the library using a CocoaPod called BSImagePicker. The images are then stored in a photoArray. I want to take those images from the array and present them in an image view that's held inside the scrollView. However, the images aren't showing up at all.
This is what I've currently tried.
This is the code that uses the CocoaPod to store images into the array:
func openImagePicker() {
    let vc = BSImagePickerViewController()

    self.bs_presentImagePickerController(
        vc,
        animated: true,
        select: { (assest: PHAsset) -> Void in },
        deselect: { (assest: PHAsset) -> Void in },
        cancel: { (assest: [PHAsset]) -> Void in },
        finish: { (assest: [PHAsset]) -> Void in
            for i in 0..<assest.count {
                self.SelectedAssets.append(assest[i])
            }
            self.convertAssetToImages()
        },
        completion: nil
    )

    print(photoArray)
    setupImages(photoArray)
}
This is the code that converts the assets to images:
extension addImages {
    func convertAssetToImages() -> Void {
        if SelectedAssets.count != 0 {
            for i in 0..<SelectedAssets.count {
                let manager = PHImageManager.default()
                let option = PHImageRequestOptions()
                var thumbnail = UIImage()
                option.isSynchronous = true
                let width = scrollView.frame.width
                let height = scrollView.frame.height
                manager.requestImage(for: SelectedAssets[i], targetSize: CGSize(width: width, height: height), contentMode: .aspectFill, options: option, resultHandler: { (result, info) -> Void in
                    thumbnail = result!
                })
                let data = thumbnail.jpegData(compressionQuality: 1)
                let newImage = UIImage(data: data! as Data)
                self.photoArray.append(newImage! as UIImage)
            }
        }
    }
}
This code takes the images from the photo array and puts them into the scroll view:
func setupImages(_ images: [UIImage]) {
    for a in 0..<photoArray.count {
        let theImage = UIImageView()
        theImage.image = self.photoArray[a]
        let xPosition = UIScreen.main.bounds.width * CGFloat(a)
        self.scrollView.frame = CGRect(x: xPosition, y: 0, width: self.scrollView.frame.width, height: self.scrollView.frame.height)
        theImage.contentMode = .scaleAspectFit
        self.scrollView.contentSize.width = self.scrollView.frame.width * CGFloat(a + 1)
        self.scrollView.addSubview(theImage)
    }
}
I have no idea why it isn't working.
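For reference, a minimal sketch of that loop with the offset frame applied to the image view rather than to the scroll view itself; this is an assumption about the intent (horizontal paging, as in the question above), not a confirmed fix, and note that setupImages(photoArray) is also called before the finish closure has filled photoArray.

func setupImages(_ images: [UIImage]) {
    for a in 0..<images.count {
        let theImage = UIImageView()
        theImage.image = images[a]
        theImage.contentMode = .scaleAspectFit
        // Each image view becomes one horizontal "page"; the scroll view's own frame stays put.
        let xPosition = scrollView.frame.width * CGFloat(a)
        theImage.frame = CGRect(x: xPosition, y: 0, width: scrollView.frame.width, height: scrollView.frame.height)
        scrollView.contentSize.width = scrollView.frame.width * CGFloat(a + 1)
        scrollView.addSubview(theImage)
    }
}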

Add UIImage to back View

Video
Hey :) I'm kind of new to Stack Overflow and not sure if I should have edited my earlier question instead of adding a whole new one, but I figured out a way to make it clearer. In my code I want the text images to show up only on the front of the UIImageViews, and the "back" image to show up only on the back side of my image view when flipped. Right now both images are showing up on both sides. Is there a way to addSubview() to only the back of the image view? I do not want to use UITableView if possible. Also, in the video the last looped UIImageView works just like I want, but the image views before it do not.
import UIKit
import Foundation

class ViewController: UIViewController {

    @IBOutlet weak var mainScrollView: UIScrollView!
    @IBOutlet var open: UIBarButtonItem!

    var myImageArray = [UIImage]()
    var back: UIImageView!
    var front: UIImageView!
    var showingFront = true

    override func viewDidLoad() {
        super.viewDidLoad()

        let singleTap = UITapGestureRecognizer(target: self, action: #selector(ViewController.tapped))
        singleTap.numberOfTapsRequired = 1
        mainScrollView.addGestureRecognizer(singleTap)
        mainScrollView.isUserInteractionEnabled = true
        mainScrollView.frame = view.frame

        myImageArray = [textImg1, textImg2, textImg3, textImg4, textImg5]

        for i in 0..<myImageArray.count {
            front = UIImageView()
            front.image = myImageArray[i]
            front.contentMode = .scaleToFill
            let yPosition = self.view.frame.height * CGFloat(i) + self.view.frame.height/2 - (self.view.frame.height / 1.1)/2
            let xPosition = self.view.frame.width/2 - (self.view.frame.width / 1.1)/2
            front.frame = CGRect(x: xPosition, y: yPosition, width: self.view.frame.width / 1.1, height: self.view.frame.height / 1.1)
            front.layer.borderWidth = 5
            mainScrollView.contentSize.height = mainScrollView.frame.height * CGFloat(i + 1)
            mainScrollView.addSubview(front)

            back = UIImageView(image: UIImage(named: "back.png"))
            back.contentMode = .scaleToFill
            back.frame = CGRect(x: xPosition, y: yPosition, width: self.view.frame.width / 1.1, height: self.view.frame.height / 1.1)
            back.layer.borderWidth = 5
            mainScrollView.addSubview(back)
        }
    }

    func tapped() {
        print("Hello")
        if (showingFront) {
            UIView.transition(from: front, to: back, duration: 1, options: UIViewAnimationOptions.transitionFlipFromRight, completion: nil)
            showingFront = false
        } else {
            UIView.transition(from: back, to: front, duration: 1, options: UIViewAnimationOptions.transitionFlipFromLeft, completion: nil)
            showingFront = true
        }
    }
}
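One thing worth sketching here (an assumption about the cause, not an accepted answer): front and back are single properties, so each loop iteration overwrites them and only the last pair can ever flip, and UIView.transition(from:to:) removes the from-view unless .showHideTransitionViews is passed. Keeping one pair per page in arrays and flipping in place might look like this:

// Sketch (hypothetical property names): one front/back image view per page,
// with the back views created in the loop and hidden until a flip happens (back.isHidden = true).
var fronts: [UIImageView] = []
var backs: [UIImageView] = []

func flip(pageIndex: Int) {
    let front = fronts[pageIndex]
    let back = backs[pageIndex]
    let showingFront = !front.isHidden
    // .showHideTransitionViews toggles isHidden instead of removing the from-view,
    // so both image views stay in the scroll view.
    UIView.transition(from: showingFront ? front : back,
                      to: showingFront ? back : front,
                      duration: 1,
                      options: [.transitionFlipFromRight, .showHideTransitionViews],
                      completion: nil)
}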

UICollectionView not displaying all of my array

I currently have a custom UICollectionView which loads a user's video library from their camera roll. I am able to add all the videos into an array, and it prints out the correct count of videos; however, my UICollectionView is not displaying all of the videos in my library (which amount to 119). Does anyone have any clue why this would be occurring?
Here is my code:
struct Media {
    var image: UIImage?
    var videoURL: NSURL?
}

var mediaArray = [Media]()

func grabPhotos() {
    let imgManager = PHImageManager.default()

    let requestOptions = PHImageRequestOptions()
    requestOptions.isSynchronous = true
    requestOptions.deliveryMode = .highQualityFormat

    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    if let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .video, options: fetchOptions) {
        if fetchResult.count > 0 {
            for i in 0..<fetchResult.count {
                var mediaItem = Media()

                // Used to fetch the image.
                imgManager.requestImage(for: fetchResult.object(at: i) as PHAsset, targetSize: CGSize(width: 400, height: 400), contentMode: .aspectFit, options: requestOptions, resultHandler: { image, error in
                    let imageOfVideo = image! as UIImage
                    mediaItem.image = imageOfVideo

                    // Used to fetch the video.
                    imgManager.requestAVAsset(forVideo: fetchResult.object(at: i) as PHAsset, options: PHVideoRequestOptions(), resultHandler: { (avAsset, audioMix, info) -> Void in
                        if let asset = avAsset as? AVURLAsset {
                            let videoData = NSURL(string: "\(asset.url)")
                            let duration: CMTime = asset.duration
                            let durationInSecond = CMTimeGetSeconds(duration)
                            print(durationInSecond)

                            mediaItem.videoURL = videoData!
                            self.mediaArray.append(mediaItem)
                            print(self.mediaArray.count)
                        }
                    })
                })
            }
        } else {
            // showAllertToImportImage() // A function to show an alert
        }
    }
}
And my cellForItemAt
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: cellId, for: indexPath) as! VideoSelectionCVCell
    cell.uploadedFile.image = mediaArray[indexPath.row].image
    return cell
}
Within my viewWillAppear I have the following code creating the UICollectionView:
let flowLayout = UICollectionViewFlowLayout()
let collectionView = UICollectionView(frame: self.view.bounds, collectionViewLayout: flowLayout)
collectionView.register(VideoSelectionCVCell.self, forCellWithReuseIdentifier: cellId)
collectionView.delegate = self
collectionView.dataSource = self
collectionView.backgroundColor = UIColor.clear
self.view.addSubview(collectionView)
I think what is occurring is that the screen is loading before grabPhotos() runs, and grabPhotos() doesn't finish until after the screen is loaded. I also have my UICollectionView being created in viewWillAppear, so it is only accessible there (if I'm correct). So I guess to fix this I would need to make the UICollectionView accessible outside that method, but how would I do that if I am creating it programmatically inside viewWillAppear?
There are a couple of different ways you can solve this, I think.
Option 1: move your remote image loading into cellForItemAt.
Add your collection view in viewDidLoad, then at the end of that function call grabPhotos().
In your grabPhotos() function, call collectionView.reloadData() after you have the fetchResult.
Move imgManager.requestImage into cellForItemAt. This way you only load each image as its cell is rendered, so the user isn't waiting for all the images to load before the collection view is updated. You can add prefetching if you are concerned about performance; a rough sketch of this idea follows.
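A minimal sketch of that first option, assuming the Media struct is changed to hold the PHAsset (the asset property is an assumption; the other names follow the question's code):

// Request each thumbnail as its cell is configured instead of up front.
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: cellId, for: indexPath) as! VideoSelectionCVCell
    let asset = mediaArray[indexPath.row].asset   // hypothetical PHAsset stored on Media

    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImage(for: asset, targetSize: CGSize(width: 400, height: 400), contentMode: .aspectFit, options: options) { image, _ in
        cell.uploadedFile.image = image
    }
    // With cell reuse you may also want to track and cancel in-flight requests.
    return cell
}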
Option 2: use a DispatchGroup.
If you really want to load all the images before updating the collection view, you can create a DispatchGroup to track the image requests and then update the collection view after they have all finished.
struct Media {
    var image: UIImage?
    var videoURL: NSURL?
}

var mediaArray = [Media]()
let loadContentGroup = DispatchGroup()

func grabPhotos() {
    loadContentGroup.notify(queue: DispatchQueue.main) { [weak self] in
        guard let `self` = self else { return }
        self.collectionView.reloadData()
    }

    let imgManager = PHImageManager.default()

    let requestOptions = PHImageRequestOptions()
    requestOptions.isSynchronous = true
    requestOptions.deliveryMode = .highQualityFormat

    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    if let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .video, options: fetchOptions) {
        if fetchResult.count > 0 {
            for i in 0..<fetchResult.count {
                var mediaItem = Media()
                loadContentGroup.enter()

                // Used to fetch the image.
                imgManager.requestImage(for: fetchResult.object(at: i) as PHAsset, targetSize: CGSize(width: 400, height: 400), contentMode: .aspectFit, options: requestOptions, resultHandler: { image, error in
                    let imageOfVideo = image! as UIImage
                    mediaItem.image = imageOfVideo

                    // Used to fetch the video.
                    imgManager.requestAVAsset(forVideo: fetchResult.object(at: i) as PHAsset, options: PHVideoRequestOptions(), resultHandler: { (avAsset, audioMix, info) -> Void in
                        if let asset = avAsset as? AVURLAsset {
                            let videoData = NSURL(string: "\(asset.url)")
                            let duration: CMTime = asset.duration
                            let durationInSecond = CMTimeGetSeconds(duration)
                            print(durationInSecond)

                            mediaItem.videoURL = videoData!
                            self.mediaArray.append(mediaItem)
                            print(self.mediaArray.count)
                            self.loadContentGroup.leave()
                        } else {
                            // Every enter() needs a matching leave(), even when the AVURLAsset cast fails,
                            // otherwise notify never fires.
                            self.loadContentGroup.leave()
                        }
                    })
                })
            }
        } else {
            // showAllertToImportImage() // A function to show an alert
        }
    }
}
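Note that the reloadData() call in the notify block assumes collectionView is stored as a property on the view controller rather than created as a local constant in viewWillAppear, which is essentially the change the question was asking about: declare something like var collectionView: UICollectionView! on the class, assign it in viewDidLoad, and then call grabPhotos().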

Issues with transparent PNG in UIActivityViewController for FB Messenger and iMessage

I have been using the following code to call the UIActivityViewController for sharing stickers via the various social media apps:
if let image = sticker.getUIImage(), let imgData = UIImagePNGRepresentation(image) {
    let activityVC = UIActivityViewController(activityItems: [imgData], applicationActivities: nil)
    activityVC.popoverPresentationController?.sourceView = gesture.view
    self.present(activityVC, animated: true, completion: nil)
} else {
    ErrorHandler.handleError(STICKER_IMAGE_NOT_FOUND, sticker)
}
This code had been working fine until the most recent update to FB Messenger (version 98.0); now it shows a "Couldn't load content" error. FB Messenger appears to prefer a URL, like this:
if let item = sticker.getImageURL() {
    let activityVC = UIActivityViewController(activityItems: [item], applicationActivities: nil)
    activityVC.popoverPresentationController?.sourceView = gesture.view
    self.present(activityVC, animated: true, completion: nil)
} else {
    ErrorHandler.handleError(STICKER_IMAGE_NOT_FOUND, sticker)
}
This works fine with FB Messenger but iMessage displays the transparent PNG with a black background.
I was looking at UIActivityViewControllerCompletionWithItemsHandler but the discussion states it runs after the activity, too late for what I need to do. I also tried creating a custom UIActivity returning UIActivityType.message for activityType but it was added on to the bottom of the controller rather than taking over the default.
Is there a way to intercept the selection of the item in UIActivityViewController so I can use the MFMessageComposeViewController and add the UIImagePNGRepresentation to the message and allow all the others to use the URL?
Is there a particular argument type that I can pass to UIActivityViewController that will correctly display transparent PNG with all the social apps?
TIA
Mike
Circling back to close this up. I eventually switched from using UIActivityViewController universally to a system that customizes what is done for each type of service. I use BDGShare by Bob de Graaf. I build a list of the services that the app supports, show a button for each, and then use a switch to jump to each type of share. In case someone is working on this, here's what I came up with.
The types of sharing the app wants to support:
public enum ShareServiceType: Int {
    case iMessage = 0, facebook, twitter, instagram, whatsApp, facebookMessenger, email, more
}
A class to store information about the type of sharing service
public class ShareTargetVO: NSObject {
    var icon: String!
    var label: String!
    var type: ShareServiceType

    init(serviceType: ShareServiceType, icon: String, label: String) {
        self.type = serviceType
        self.icon = icon
        self.label = label
    }
}
A function in my social networking helper to populate the list of available services:
static func getShareTargetList(for sticker: ReeSticker) -> [ShareTargetVO] {
    var services: Array<ShareTargetVO> = []

    // Check to see which capabilities are present and add buttons.
    if (BDGShare.shared().isAvailable(forServiceType: SLServiceTypeFacebook)) {
        services.append(ShareTargetVO(serviceType: ShareServiceType.facebook, icon: "Icon-Share-Facebook", label: "Facebook"))
    }

    // Checking for the Facebook service type because there's no availability check with FBSDK for Messenger (could be rolled into the lib).
    if (BDGShare.shared().isAvailable(forServiceType: SLServiceTypeFacebook) && !sticker.type.doesContain("video")) {
        services.append(ShareTargetVO(serviceType: ShareServiceType.facebookMessenger, icon: "Icon-Share-Messenger", label: "Messenger"))
    }

    if (BDGShare.shared().isAvailable(forServiceType: SLServiceTypeTwitter)) {
        services.append(ShareTargetVO(serviceType: ShareServiceType.twitter, icon: "Icon-Share-Twitter", label: "Twitter"))
    }

    if (BDGShare.shared().canSendSMS()) {
        services.append(ShareTargetVO(serviceType: ShareServiceType.iMessage, icon: "Icon-Share-Messages", label: "Messages"))
    }

    if UIApplication.shared.canOpenURL(URL(string: "whatsapp://")! as URL) && !sticker.type.contains("video") {
        services.append(ShareTargetVO(serviceType: ShareServiceType.whatsApp, icon: "Icon-Share-Whatsapp", label: "What's App?"))
    }

    if UIApplication.shared.canOpenURL(URL(string: "instagram://app")! as URL) && !sticker.type.contains("video") {
        services.append(ShareTargetVO(serviceType: ShareServiceType.instagram, icon: "Icon-Share-Instagram", label: "Instagram"))
    }

    if (BDGShare.shared().canSendEmail()) {
        services.append(ShareTargetVO(serviceType: ShareServiceType.email, icon: "Icon-Share-Mail", label: "Email"))
    }

    services.append(ShareTargetVO(serviceType: ShareServiceType.more, icon: "Icon-Share-More", label: "More"))

    return services
}
A function in my view controller to populate a UICollectionView of share buttons, limited to the services returned from the list function:
func layoutShareButtons() {
    let f = self.view.frame
    let btnWidth = f.width * 0.82
    let bannerWidth = btnWidth + 10

    mask = UIView(frame: CGRect(x: 0, y: 0, width: f.width, height: f.height))
    mask.backgroundColor = .black
    mask.alpha = 0.3
    self.view.addSubview(mask)

    buttonList = SocialHelper.getShareTargetList(for: self.sticker)

    let buttonGridLayout = UICollectionViewFlowLayout()
    buttonGridLayout.sectionInset = UIEdgeInsets(top: 5, left: 5, bottom: 5, right: 5)
    buttonGridLayout.itemSize = CGSize(width: 60, height: 60)
    buttonGridLayout.scrollDirection = .horizontal

    buttonListView = UICollectionView(frame: CGRect(x: (f.width - bannerWidth) / 2,
                                                    y: self.preview.frame.origin.y + self.preview.frame.height + 10,
                                                    width: bannerWidth,
                                                    height: 80),
                                      collectionViewLayout: buttonGridLayout)
    buttonListView.register(ShareButtonCell.self, forCellWithReuseIdentifier: "shareButtonCell")
    buttonListView.dataSource = self
    buttonListView.delegate = self
    self.view.addSubview(buttonListView)

    // Cancel button for the sharing view (added last to ensure it's on top of the z-order).
    cancelButton = SimpleButton(frame: CGRect(x: (f.width - bannerWidth) / 2, y: self.buttonListView.frame.origin.y + self.buttonListView.frame.height + 10, width: bannerWidth, height: 52))
    cancelButton.backgroundColor = UIColor(netHex: 0x202020)
    cancelButton.layoutComponent(0, label: "Cancel")
    cancelButton.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(self.cancelButtonPressed(_:))))
    self.view.addSubview(self.cancelButton)
}
The UICollectionView didSelect handler (for better separation of concerns, depending on your app, move the "share" function to a separate class dedicated to the share implementation; in this app the screen we're working with exists specifically for sharing):
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    self.share(shareTarget: buttonList[indexPath.row])
}
Finally, a function to call the correct share type:
func share(shareTarget: ShareTargetVO) {
    // Params to submit to the service.
    self.shareUrl = self.sticker.getStickerURL()
    let textStr = "" // BDGShare supports a message passed in as well, but we just send the sticker.

    // We need the NSData either way (sticker / video).
    var ok = true
    // try? is fine here because the result is tested below.
    let urlData: Data? = try? Data(contentsOf: self.shareUrl as URL)
    ok = (urlData != nil)

    var img: UIImage? = nil
    // If it's an image type then get it.
    if ok {
        if (self.shareUrl.pathExtension.contains("png")) || (self.shareUrl.pathExtension.contains("jpg")) {
            img = UIImage(data: urlData! as Data)
            ok = (img != nil)
        }
    }

    if !ok {
        let alertCtrl = UIAlertController(title: "Error sending", message: "There was an error gathering the information for sending this sticker.", preferredStyle: .alert)
        alertCtrl.addAction(UIAlertAction(title: "OK", style: UIAlertActionStyle.default, handler: nil))
        self.present(alertCtrl, animated: true, completion: nil)
        return
    }

    switch shareTarget.type {
    case .iMessage:
        BDGShare.shared().shareSMS(textStr, recipient: nil, image: img, data: urlData! as Data, imageName: "sendSticker.png", completion: { (SharingResult) -> Void in
            // Handle share result...
            self.handleShareResult(shareTarget.type, shareResult: SharingResult)
        })
        break
    case .facebook:
        BDGShare.shared().shareFacebook("", urlStr: "", image: img, completion: { (SharingResult) -> Void in
            // Handle share result...
            self.handleShareResult(shareTarget.type, shareResult: SharingResult)
        })
        break
    case .twitter:
        // ... more code for handling each type of share
    }
}
So those are all the pieces I used to implement BDGShare and work around UIActivityViewController. By the way, the "More" option at the end of the list still calls UIActivityViewController, so it remains available to the user if they so choose.
HTH, Mike

How to create a copy of UIView objects, and decrease the font size of a particular label, where views are made programmatically, in Swift?

I have created views programmatically as subviews:
let view1 = UIView(frame: CGRectMake(30, cgfloat , 270, 120))
view1.backgroundColor = UIColor.whiteColor()
view1.layer.cornerRadius = 40.0
let Attdate: UILabel = UILabel()
Attdate.frame = CGRectMake(15, 10, 100, 25)
Attdate.backgroundColor = UIColor.orangeColor()
Attdate.textColor = UIColor.blackColor()
Attdate.font = UIFont(name: "BebasNeue", size: 106)
Attdate.textAlignment = NSTextAlignment.Left
Attdate.text = AttDate.description
view1.addSubview(Attdate)
I have an array as the response from the server, which gives me an array of data, and I want to print that data into labels. The labels should be created dynamically according to the array length, and that's the reason I am trying to duplicate view1 (my UIView object). I tried NSKeyedArchiver (not sure how it will help):
extension UIView {
    func copyView() -> AnyObject {
        return NSKeyedUnarchiver.unarchiveObjectWithData(NSKeyedArchiver.archivedDataWithRootObject(self))!
    }
}
And declared:
let view1 = UIView()
let copiedView = view1.copyView() as! UIView
print("CopiedView:\(copiedView)")
But, no luck :(
Also, I tried many syntaxes to decrease the font size of a particular label, but none seemed to work.
Kindly reply.
To decrease the font size of a particular label, there are two options:
You can make the font size part of an initializer, so you can set it every time you have to create one, like this:
class CustomView: UIView {
    init(fontSizeOfLabel: CGFloat) {
        super.init(frame: CGRect(x: 30, y: 20, width: 270, height: 120))
        // all your normal code in here
        let Attdate: UILabel = UILabel()
        Attdate.font = UIFont(name: "BebasNeue", size: fontSizeOfLabel)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
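As a usage sketch for this first option (DateCardView, responseArray, and the text parameter are hypothetical additions; the answer's CustomView only takes the font size), the question's goal of one view per server-response element could be handled by creating a fresh view per element instead of copying view1:

// Sketch: a variant of CustomView whose initializer also takes the label text.
class DateCardView: UIView {
    let dateLabel = UILabel()

    init(fontSizeOfLabel: CGFloat, text: String) {
        super.init(frame: CGRect(x: 30, y: 20, width: 270, height: 120))
        dateLabel.frame = CGRect(x: 15, y: 10, width: 100, height: 25)
        dateLabel.font = UIFont(name: "BebasNeue", size: fontSizeOfLabel)
        dateLabel.text = text
        addSubview(dateLabel)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

// Usage inside a view controller: one card per element of the server response, stacked vertically.
let responseArray = ["Mon 01", "Tue 02", "Wed 03"]   // stand-in for the real server data
for (index, text) in responseArray.enumerated() {
    let card = DateCardView(fontSizeOfLabel: 70, text: text)
    card.frame.origin.y = 20 + CGFloat(index) * 130   // 120pt tall views plus 10pt spacing
    view.addSubview(card)
}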
Or you can loop over its subviews and change it after you have created it:
var instance = CustomView()
for subview in instance.subviews {
    if var label = subview as? UILabel { // check if the subview can be cast to a UILabel; if it can, it is your label
        label.font = UIFont(name: "BebasNeue", size: 70)
    }
}
I haven't tested the second solution, but I have checked the first one.
