Swift: optional array count - arrays

In Objective-C, if I had the following property:
@property (strong, nonatomic) NSArray *myArray;
A method to return a number of objects in myArray would look like:
- (NSInteger)numberOfObjectsInMyArray
{
    return [self.myArray count];
}
This would return either the number of objects in the array, or 0 if myArray == nil.
The best equivalent I can think of for doing this in Swift is:
var myArray: Array<String>?
func numberOfObjectsInMyArray() -> Int
{
    return myArray ? myArray!.count : 0
}
So checking the optional array contains a value, and if so unwrap the array and return that value, otherwise return 0.
Is this the correct way to do this? Or is there something simpler?

Try using the nil coalescing operator.
According to the Apple Documentation:
The nil coalescing operator (a ?? b) unwraps an optional a if it contains a value, or returns a default value b if a is nil.
So your function could look like this:
func numberOfObjectsInMyArray() -> Int {
    return (myArray?.count ?? 0)
}
I agree with others that this could be a bad idea for a number of reasons (like making it look like there is an array with a count of "0" when there isn't actually an array at all), but hey, even bad ideas need an implementation.
EDIT:
So I'm adding this because two minutes after I posted this answer, I came across a reason for doing exactly what the author wants to do.
I am implementing the NSOutlineViewDataSource protocol in Swift. One of the functions required by the protocol is:
optional func outlineView(_ outlineView: NSOutlineView,
                          numberOfChildrenOfItem item: AnyObject?) -> Int
That function requires that you return the number of children of the item parameter. In my code, if the item has any children, they will be stored in an array, var children: [Person]?
I don't initialize that array until I actually add a child to the array.
In other words, at the time that I am providing data to the NSOutlineView, children could be nil, or it could be populated, or it could have once been populated but subsequently had all objects removed from it, in which case it won't be nil but its count will be 0. NSOutlineView doesn't care if children is nil - all it wants to know is how many rows it will need to display the item's children.
So, it makes perfect sense in this situation to return 0 if children is nil. The only reason for calling the function is to determine how many rows NSOutlineView will need. It doesn't care whether the answer is 0 because children is nil or because it is empty.
return (children?.count ?? 0) will do what I need. If children is nil it will return 0. Otherwise it will return count. Perfect!
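For illustration, here is a minimal sketch of how that data source method can look with the nil coalescing operator. The Person type and the PeopleDataSource class are hypothetical stand-ins (and on recent SDKs the item parameter is typed Any? rather than AnyObject?):
import Cocoa

class Person {
    var name: String
    var children: [Person]?   // stays nil until a child is actually added
    init(name: String) { self.name = name }
}

class PeopleDataSource: NSObject, NSOutlineViewDataSource {
    var rootPeople: [Person] = []

    func outlineView(_ outlineView: NSOutlineView, numberOfChildrenOfItem item: Any?) -> Int {
        if let person = item as? Person {
            // 0 whether children is nil or merely empty; NSOutlineView doesn't care which
            return person.children?.count ?? 0
        }
        return rootPeople.count   // item == nil means the invisible root item
    }
}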

That looks like the simpler way.
The Objective-C code is shorter only because Objective-C, being a C-based language, treats nil as a form of 0 (messaging nil returns 0).
Since Swift is strongly typed, you don't have such a shorthand. In this specific case it requires a little more effort, but in general it saves you most of the headaches caused by loose typing.
Concerning the specific case, is there a reason for making the array optional in the first place? You could just have an empty array. Something like this might work for you:
var myArray: Array<String> = []
func numberOfObjectsInMyArray() -> Int {
    return myArray.count
}

How about using optional for return value?
var myArray: Array<String>?
func numberOfObjectsInMyArray() -> Int? {
    return myArray?.count
}
I think that this way is safer.
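A caller can then decide how to treat the "no array yet" case, for example (illustrative only):
if let count = numberOfObjectsInMyArray() {
    print("myArray contains \(count) objects")
} else {
    print("myArray hasn't been created yet")
}

// Or collapse back to 0 at the call site when the distinction doesn't matter:
let rows = numberOfObjectsInMyArray() ?? 0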

Related

Beginner Question: Func return String and Array

I'm experimenting with manipulating basic Swift data types and tried to add a (basic) layer of complexity to some simple function examples I was practising from Swift's documentation. I was able to return a simple array, but when I tried to return an array within a String I got errors. I've read all the Swift documentation for Arrays to try and solve this first, with no luck.
Here is my code for the successful Array return:
func namesList(person: String) -> [(String)] {
    let register = ["RoboCop", person, "Terminator"]
    return register.sorted()
}
... and my unsuccessful code:
func namesList(person: String) -> [(String)] {
    let register = "The alphabetic order of names are \(["RoboCop", person, "Terminator"])"
    return register.sorted()
}
I think the problem is in my return type, but I couldn't find a way to return a string containing the array.
Many thanks
.sorted() operates on Sequence, and both Array and String are Sequences. [String].sorted (your first example) returns a [String]. String.sorted (your second example) returns a [Character], since a String is made up of Characters.
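A quick illustration of the difference, assuming Swift 4 or later, where String is itself a Sequence of Character:
let names = ["RoboCop", "Terminator"]
let sortedNames: [String] = names.sorted()          // ["RoboCop", "Terminator"]

let sentence = "The alphabetic order of names are"
let sortedLetters: [Character] = sentence.sorted()  // individual characters, sorted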
Your second example is not "an array nested in a String." It's just a String. \(...) performs string interpolation. It doesn't nest anything.
If you want to return the String, you'll need to return String.
func namesList(person: String) -> String {
    let register = ["RoboCop", person, "Terminator"].sorted()
    return "The alphabetic order of names are \(register)"
}
As a very minor point, [(String)] is unusual syntax and can be confusing to experienced Swift devs. The correct way to write this is just [String] with no extra parentheses.
The register variable is actually a String in the second case.
You should change the return type to just String.

Duplicate Int in Array , Dictionary or Set in SWIFT

Reading up on Sets and Arrays, I find that a Set cannot store duplicate values (Ints, Strings, etc.).
Knowing this, if we want to find a duplicate Int in an array, and one method is to convert the Array to a Set, how come we don't get an error once the Array becomes a Set?
The methods below simply return a Bool value if the array contains duplicates.
import UIKit
func containsDuplicatesDictionary(a: [Int]) -> Bool {
    var aDict = [Int: Int]()
    for value in a {
        if let count = aDict[value] {
            aDict[value] = count + 1
            return true
        } else {
            aDict[value] = 1
        }
    }
    return false
}
containsDuplicatesDictionary(a: [1,2,2,4,5])
func containsDuplicatesSet(a: [Int]) -> Bool {
    return Set(a).count != a.count
}
containsDuplicatesSet(a: [1,2,2,4])
In the first function, containsDuplicatesDictionary, I convert the array to a Dictionary, which of course takes a for loop as well. The Set method can be done in one line, which is really nice. But I guess, since I am new to this, I would have thought converting the array would throw an error immediately since there are duplicate values.
What am I missing when it's converted?
Thank you.
Set, by design, is an unordered collection of unique elements. The implementation of Set takes care of duplicate values itself: when you try to add a duplicate value, it checks whether the value is already present in the Set, and if it is, the value is not added.
When you call the initializer of Set that takes a sequence as its input parameter (this is what you use when writing Set(a), where a is of type [Int]), under the hood the initializer adds the elements one by one, checking whether each new element is already present in the Set.
You could make a custom initializer for Set that throws an error when you try to add a duplicate value, but it wouldn't really have any advantages for users of Swift; hence the current implementation, which simply doesn't add a value that is already present in the Set and doesn't throw an error. This way, you can safely and easily get rid of any duplicates in a non-unique collection of elements (such as an array).
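To see this behaviour in isolation, a small sketch:
let numbers = [1, 2, 2, 4]
let uniqueNumbers = Set(numbers)       // {1, 2, 4}; the duplicate 2 is silently dropped
uniqueNumbers.count != numbers.count   // true, so the array contained a duplicate

// insert(_:) even reports whether the element was actually added
var seen = Set<Int>()
seen.insert(2).inserted                // true
seen.insert(2).inserted                // false; the value is ignored, no error is thrown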

Error Swift Programming How to search an Array of [Any]--> Generic parameter 'C.Generator.Element' cannot be bound to non-#ob

Hi, I'm trying to search an array containing various objects, but I'm getting an error:
Generic parameter 'C.Generator.Element' cannot be bound to non-#ob
Here is my code I'm using:
var arraySearching = [Any]()
arraySearching = ["this","that",2]
find(arraySearching, 2)
How do you search arrays of type [Any]?
This is a misleading error message, but the problem is that find requires that the contents of the collection you are searching be Equatable, and Any isn’t. You can’t do much with Any at all in fact, never mind equate it with values of other types. You have to cast it to a real type and then use that.
That’d be easy to do in a closure, except there isn’t a version of find that takes a closure. But it’s easy to write one:
func find<C: CollectionType>(source: C, match: C.Generator.Element -> Bool) -> C.Index? {
    for idx in indices(source) {
        if match(source[idx]) { return idx }
    }
    return nil
}
Once you have this, you can use it to search for a value of a specific type in an array of Any:
find(arraySearching) {
    // cast the candidate to an Int, then compare (comparing an
    // optional result of as? works fine, nil == 2 is false)
    ($0 as? Int) == 2
}
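As an aside, later Swift versions ship a closure-taking search in the standard library, so on a modern toolchain (Swift 4.2 or newer) the custom find above isn't needed; a hedged sketch:
let arraySearching: [Any] = ["this", "that", 2]
let index = arraySearching.firstIndex { ($0 as? Int) == 2 }   // Optional(2)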

Swift Dictionary lookup causing compile-time error

I'm dipping my toe into Swift, but have run into an issue that has me slightly confused. Given an integer index I'm trying to fetch the corresponding key of a Dictionary and return the value associated with it.
Using the following structure as an example:
class CustomClass {
    private var collection: [String: [MyCustomCollectionClass]] = ["MyStringKey": [MyCustomCollectionClass]()]
    /* ... */
}
I tried to solve the problem like so:
var keys = Array(self.collection.keys)
var key: String = keys[section] as String
return self.collection[key].count // error is flagged here
But found that this results in a compiler error, which states that 'String' is not convertible to 'DictionaryIndex'. Stumped, I tried a slightly more verbose solution and was surprised to find that this compiled and worked without issue.
var keys = Array(self.collection.keys)
var key: String = keys[section] as String
var collection: [MyCustomCollectionClass] = self.collection[key]! as [MyCustomCollectionClass]
return collection.count
Can anyone explain to me why the first solution refuses to compile?
As @Zaph said, ignoring potential fatal errors is a bad idea, and it's something that Swift was, in part, designed to help with. This is the most "swifty" code I could come up with:
func collectionCount(#section: Int) -> Int? {
    switch section {
    case 0..<collection.count: // Make sure section is within the bounds of collection's keys array
        let key = collection.keys.array[section] // Grab the key from the collection's keys array
        return collection[key]!.count // We can force unwrap collection[key] here because we know that key exists in collection
    default:
        return nil
    }
}
It uses the range/pattern-matching feature of Swift's switch statement to make sure that section is within the bounds of collection's keys array; that felt more "swifty" than using if, mainly because I couldn't find a way to use Swift's Range in an if statement (though see the aside below). It also uses the array property of the lazy collection collection.keys as a shortcut instead of creating a new Array with Array(collection.keys). Since we've already made sure that section is within the bounds of collection.keys, we can forcibly unwrap collection[key]! when we get its count.
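(As an aside, the pattern-matching operator that powers switch cases, ~=, can also be used directly in an if, so a plain range check is possible too; a small sketch:)
if 0..<collection.count ~= section {
    // section is a valid index into the keys array
}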
Just for fun, I also made a generic function that takes a collection as input to generalize things:
func collectionCount<T,U>(#collection: [T:[U]], #section: Int) -> Int? {
    switch section {
    case 0..<collection.count: // Make sure section is within the bounds of collection's keys array
        let key = collection.keys.array[section] // Grab the key from the collection's keys array
        return collection[key]!.count // We can force unwrap collection[key] here because we know that key exists in collection
    default:
        return nil
    }
}
[T:[U]] basically says that collection needs to be a Dictionary with key T whose values are an Array of U.
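For example, with a hypothetical dictionary (note that a Dictionary's key order is unspecified, so which count you get for a given section depends on that order):
let scores: [String: [Int]] = ["alice": [10, 20, 30], "bob": [40]]

// T is inferred as String and U as Int
let first = collectionCount(collection: scores, section: 0)    // the count for whichever key comes first
let missing = collectionCount(collection: scores, section: 5)  // nil, section is out of bounds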
Ignoring the potential fatal error is a really bad idea. The whole reason for Optionals is to prevent crashes at runtime.
func collectionCount(#section: Int) -> Int? {
    var keys = Array(self.collection.keys)
    if section < keys.count {
        var key = keys[section] as String
        println("key: \(key)")
        return self.collection[key]!.count
    }
    else {
        // handle error here
        return nil
    }
}
Throwing in "!" unwrapping without knowing that the value can never be nil is much worse than the Objective-C handling of nil. If this becomes the standard way of handling Optionals by a substantial number of developers, Swift will be a disaster. Please do not do this.
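For completeness, here is a sketch of the same lookup with no force unwrapping at all, written in later Swift syntax (guard plus optional chaining); the collection property is assumed to be the dictionary from the question:
func collectionCount(section: Int) -> Int? {
    let keys = Array(collection.keys)
    guard section < keys.count else { return nil }
    // Dictionary subscripting already returns an optional,
    // so optional chaining avoids the "!" entirely
    return collection[keys[section]]?.count
}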

Why do I need a '<' overload for an Array class?

I'm trying to add functionality to an Array class.
So I attempted to add a sort() similar to Ruby's lexicon.
For this purpose I chose the name 'ricSort()' in deference to Swift's sort().
But the compiler says it can't find an overload for '<', although sort({$0 < $1}) by itself works okay.
Why?
var myArray: Array = [5, 4, 3, 2, 1]
myArray.sort({$0 < $1})  // [1, 2, 3, 4, 5]
myArray.ricSort()        // this doesn't work
Here's a solution that is close to what you are looking for, followed by a discussion.
var a: Int[] = [5,4,3,2,1]
extension Array {
    func ricSort(fn: (lhs: T, rhs: T) -> Bool) -> T[] {
        let tempCopy = self.copy()
        tempCopy.sort(fn)
        return tempCopy
    }
}
var b = a.ricSort(<) // [1, 2, 3, 4, 5]
There are two problems with the original code. The first, a fairly simple mistake, is that Array.sort returns no value whatsoever (represented as () which is called void or Unit in some other languages). So your function, which ends with return self.sort({$0 < $1}) doesn't actually return anything, which I believe is contrary to your intention. So that's why it needs to return tempCopy instead of return self.sort(...).
This version, unlike yours, makes a copy of the array to mutate, and returns that instead. You could easily change it to make it mutate itself (the first version of the post did this if you check the edit history). Some people argue that sort's behavior (mutating the array, instead of returning a new one) is undesirable. This behavior has been debated on some of the Apple developer lists. See http://blog.human-friendly.com/swift-arrays-the-bugs-the-bad-and-the-ugly-incomplete
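In current Swift the distinction is spelled sort() versus sorted(): the first mutates in place and returns Void, the second returns a new array. A quick sketch:
var numbers = [3, 1, 2]
numbers.sort()                        // mutates in place, returns ()
// numbers is now [1, 2, 3]

let sortedCopy = [3, 1, 2].sorted()   // leaves the original alone, returns [1, 2, 3]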
The other problem is that the compiler does not have enough information to generate the code that would implement ricSort, which is why you are getting the type error. It sounds like you are wondering why it is able to work when you use myArray.sort but not when you try to execute the same code inside a function on the Array.
The reason is that you told the compiler what myArray consists of:
var myArray:Array = [5,4,3,2,1]
This is shorthand for
var myArray: Array<Int> = [5,4,3,2,1]
In other words, the compiler inferred that myArray consists of Int, and it so happens that Int conforms to the Comparable protocol that supplies the < operator (see: https://developer.apple.com/library/prerelease/ios/documentation/General/Reference/SwiftStandardLibraryReference/Comparable.html#//apple_ref/swift/intf/Comparable). From the docs, you can see that < has the following signature:
@infix func < (lhs: Self, rhs: Self) -> Bool
Depending on what languages you have a background in, it may surprise you that < is defined in terms of the language, rather than just being a built-in operator. But if you think about it, < is just a function that takes two arguments and returns true or false. The @infix means that it can appear between its two arguments, so you don't have to write < 1 2.
(The type "Self" here means, "whatever the type is that this protocol implements," see Protocol Associated Type Declaration in https://developer.apple.com/library/prerelease/ios/documentation/swift/conceptual/swift_programming_language/Declarations.html#//apple_ref/doc/uid/TP40014097-CH34-XID_597)
Compare this to the signature of Array.sort: isOrderedBefore: (T, T) -> Bool
That is the generic signature. By the time the compiler is working on this line of code, it knows that the real signature is isOrderedBefore: (Int, Int) -> Bool
The compiler's job is now simple, it just has to figure out, is there a function named < that matches the expected signature, namely, one that takes two values of type Int and returns a Bool. Obviously < does match the signature here, so the compiler allows the function to be used here. It has enough information to guarantee that < will work for all values in the array. This is in contrast to a dynamic language, which cannot anticipate this. You have to actually attempt to perform the sort in order to learn if the types can actually be sorted. Some dynamic languages, like JavaScript, will make every possible attempt to continue without failing, so that expressions such as 0 < "1" evaluate correctly, while others, such as Python and Ruby, will throw an exception. Swift does neither: it prevents you from running the program, until you fixed the bug in your code.
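To make the "operators are just functions" point concrete, here is a sketch in current Swift: once a type conforms to Comparable, the bare < can be passed anywhere a (T, T) -> Bool is expected. Money is a made-up type; since Swift 4.1 the == requirement is synthesized automatically for simple structs:
struct Money: Comparable {
    let cents: Int

    static func < (lhs: Money, rhs: Money) -> Bool {
        return lhs.cents < rhs.cents
    }
}

let prices = [Money(cents: 300), Money(cents: 150)]
let cheapestFirst = prices.sorted(by: <)   // [Money(cents: 150), Money(cents: 300)]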
So, why doesn't ricSort work? Because there is no type information for it to work with until you have created an instance of a particular type. It cannot infer whether the ricSort will be correct or not.
For example, suppose instead of myArray, I had this:
enum Color {
    case Red, Orange, Yellow, Green, Blue, Indigo, Violet
}
var myColors = [Color.Red, Color.Blue, Color.Green]
var sortedColors = myColors.ricSort() // Kaboom!
In that case, myColors.ricSort would fail based on a type error, because < hasn't been defined for the Color enumeration. This can happen in dynamic languages, but is never supposed to happen in languages with sophisticated type systems.
Can I still use myColors.sort? Sure. I just need to define a function that takes two colors and returns them in some order that makes sense for my domain (EM wavelength? Alphabetical order? Favorite color?):
func colorComesBefore(lhs: Color, rhs: Color) -> Bool { ... }
Then, I can pass that in: myColors.sort(colorComesBefore)
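One possible body for that function, as a hedged sketch: order the colors by their position in the rainbow via an explicit switch, which works even though Color defines no < of its own:
func position(of color: Color) -> Int {
    switch color {
    case .Red:    return 0
    case .Orange: return 1
    case .Yellow: return 2
    case .Green:  return 3
    case .Blue:   return 4
    case .Indigo: return 5
    case .Violet: return 6
    }
}

func colorComesBefore(lhs: Color, rhs: Color) -> Bool {
    return position(of: lhs) < position(of: rhs)
}

// myColors.sort(colorComesBefore) then yields Red, Green, Blue (rainbow order)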
This shows, hopefully, that in order to make ricSort work, we need to construct it in such a way that its definition guarantees that when it is compiled, it can be shown to be correct, without having to run it or write unit tests.
Hopefully that explains the solution. Some proposed modifications to the Swift language may make this less painful in the future. In particular creating parameterized extensions should help.
The reason you are getting an error is that the compiler cannot guarantee that the type stored in the Array can be compared with the < operator.
You can see the same sort closure on an array whose type can be compared using < like an Int:
var list = [3,1,2]
list.sort {$0 < $1}
But you will get an error if you try to use a type that cannot be compared with <:
var URL1 = NSURL()
var URL2 = NSURL()
var list = [URL1, URL2]
list.sort {$0 < $1} // error
Especially with all the syntax you can leave out in Swift, I don't see a reason to define a method for this. The following is valid and works as expected:
list.sort(<)
You can do this because < actually defines a function that takes two Ints and returns a Bool, just like the sort method is expecting.
