Swift remove objects in Array range

I have an array as a property in a class.
class Custom {
    var objArray: [CustomClass] = []
}
I want to remove some items from objArray over a range, so I tried the following:
let newVar = objArray[1...3]
The right objects end up in newVar, but since Array is a value type the original objArray is left untouched. How can I make the removal show up in the original array?
The code below gets an "Index out of bounds" error as the indexes are incremented:
for i in 1...3 {
    objArray.remove(at: 1)
}
What is the best approach for this issue?
Any hint in the right direction would be highly appreciated.

Use the removeSubrange(_:) method of Array. Build a valid range from the element location and length:
var array = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
let range = 1...3
array.removeSubrange(range)
print(array)
Output: [1, 5, 6, 7, 8, 9, 10]
Note: the range must be valid, that is, it must not reach outside the array, otherwise removeSubrange(_:) crashes at runtime.
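Applied to the class from the question, a minimal sketch might look like this (removeObjects(in:) is just an illustrative helper name, CustomClass is the question's own type, and objArray must be declared var so it can be mutated):
class Custom {
    var objArray: [CustomClass] = []

    // Removes the elements whose indexes fall inside `range`,
    // ignoring any part of the range that lies outside the array.
    func removeObjects(in range: ClosedRange<Int>) {
        guard !objArray.isEmpty,
              range.lowerBound < objArray.count,
              range.upperBound >= 0 else { return }
        objArray.removeSubrange(range.clamped(to: 0...objArray.count - 1))
    }
}
Calling removeObjects(in: 1...3) then drops elements 1 through 3 from the original array in place.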
Here is your way (with a for loop):
We cannot remove objects by index in a loop, because every removal changes the array's count and shifts the remaining indexes, so you can get an out-of-range crash or the wrong output. Instead, build up a second array. See the example below:
var array = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
var newArray: [Int] = []
let minRange = 1
let maxRange = 3
for i in 0..<array.count {
    if i >= minRange && i <= maxRange {
        // skip the elements that fall inside the range
        continue
    }
    newArray.append(array[i])
}
print(newArray)
Output: [1, 5, 6, 7, 8, 9, 10]
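If you prefer not to write the loop yourself, the same result can be obtained by filtering on the offsets that enumerated() provides (a small sketch reusing the array, minRange and maxRange values from above):
let filtered = array.enumerated()
    .filter { !(minRange...maxRange).contains($0.offset) }
    .map { $0.element }
print(filtered)
Output: [1, 5, 6, 7, 8, 9, 10]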

If you want to remove items by index over a range, you have to reverse the indexes and start with the highest one, otherwise you will get an out-of-range crash. Consider also that indexes are zero-based.
Here is a safe version, which also checks the upper bound of the array:
var array = [1, 2, 3, 4, 5, 6]
for i in (0...3).reversed() where i < array.count {
    array.remove(at: i)
}
print(array) // [5, 6]
You can find a more generic and more efficient solution here
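For illustration only, a generic helper in the same spirit as the reversed loop above might look like this (removeElements(at:) is a hypothetical name, not necessarily the solution behind that link):
extension Array {
    mutating func removeElements(at offsets: Set<Int>) {
        // Remove from the highest index down so earlier removals
        // don't shift the positions that still have to be removed.
        for i in offsets.sorted(by: >) where indices.contains(i) {
            remove(at: i)
        }
    }
}

var numbers = [1, 2, 3, 4, 5, 6]
numbers.removeElements(at: [0, 1, 2, 3])
print(numbers) // [5, 6]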

This solution also returns the removed values:
extension Array {
    /// Removes the elements in `range` and returns them.
    ///
    /// ## Example:
    ///     var arr = [0, 1, 2, 3]
    ///     arr.remove(0..<2)   // [0, 1]
    ///     arr                 // [2, 3]
    @discardableResult
    mutating func remove(_ range: Range<Int>) -> Array {
        let values = Array(self[range])
        self.removeSubrange(range)
        return values
    }
}
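A quick usage sketch of that extension:
var numbers = [10, 20, 30, 40, 50]
let removed = numbers.remove(1..<3)
print(removed) // [20, 30]
print(numbers) // [10, 40, 50]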

The issue you are having is that an array index is zero-based, which is to say, the first element in an array is accessed by:
let firstArrayValue = objArray[0]
So in the case of your for loop, you need to subtract 1 from i to get the proper index value:
for i in 1...3 {
    objArray.remove(at: i - 1)
}
A better way is to start at index 0; because each removal shifts the remaining values down, removing at index 0 three times removes the first three elements of objArray:
for _ in 0...2 {
    objArray.remove(at: 0)
}
If you need to remove elements from the middle of the array, you must first find their index and then remove them. index(of:) returns an optional index, so unwrap it before removing:
if let indexLocation = objArray.index(of: valueInArray) { // valueInArray is whatever element you're looking for
    objArray.remove(at: indexLocation)
}

Related

Swift returning Array from ArraySlices

I'm working with a sorting function that takes an array of Ints already sorted in descending order and places a new Int in its correct spot (i.e. if my sorted array was [10, 7, 2] and the new Int was 5, the function would return [10, 7, 5, 2]). The function for doing this, once it has found the correct spot for the new Int, slices the original array into the items before the new Int's spot and those after, and then combines the slices with the new Int.
The problem I'm running into is that this won't give me an array but rather an array slice.
Code:
func addToSorted(sorted: [Int], new: Int) -> [Int] {
    if sorted.count == 0 {
        return [new]
    } else {
        for index in 0..<sorted.count {
            let item = sorted[index]
            if new > item {
                return sorted[..<index] + [new] + sorted[index...]
            }
        }
    }
}
let result = addToSorted(sorted: [10, 7, 2], new: 5)
print(result) // expected [10, 7, 5, 2]
This is a more generic (and efficient) alternative which uses binary search
extension RandomAccessCollection where Element: Comparable {
    func descendingInsertionIndex(of value: Element) -> Index {
        var slice: SubSequence = self[...]
        while !slice.isEmpty {
            let middle = slice.index(slice.startIndex, offsetBy: slice.count / 2)
            if value > slice[middle] {
                slice = slice[..<middle]
            } else {
                slice = slice[index(after: middle)...]
            }
        }
        return slice.endIndex
    }
}
And to use it:
var array = [10, 7, 5, 2]
let index = array.descendingInsertionIndex(of: 4)
array.insert(4, at: index)
print(array) // [10, 7, 5, 4, 2]
For ascending order, replace if value > slice[middle] with if value < slice[middle], and return slice.endIndex with return slice.startIndex.
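Spelled out, that ascending variant would look like this (the name ascendingInsertionIndex is just for illustration):
extension RandomAccessCollection where Element: Comparable {
    func ascendingInsertionIndex(of value: Element) -> Index {
        var slice: SubSequence = self[...]
        while !slice.isEmpty {
            let middle = slice.index(slice.startIndex, offsetBy: slice.count / 2)
            if value < slice[middle] {
                slice = slice[..<middle]
            } else {
                slice = slice[index(after: middle)...]
            }
        }
        return slice.startIndex
    }
}

var ascending = [2, 5, 7, 10]
ascending.insert(6, at: ascending.ascendingInsertionIndex(of: 6))
print(ascending) // [2, 5, 6, 7, 10]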
If you use the Swift Algorithms package, this insertion is a one-liner:
import Algorithms

var arr = [10, 7, 2]
arr.insert(5, at: arr.partitioningIndex { $0 < 5 })
print(arr) // [10, 7, 5, 2]
This is very efficient — O(log n) — because your array is already partitioned (sorted) and therefore it uses a binary search.
You would have to promote the slices to arrays:
return Array(sorted[..<index]) + [new] + Array(sorted[index...])
A few other points:
You should make a habit of using sorted.isEmpty rather than sorted.count == 0; it's much faster for some collections that don't store their count, such as lazy collections or even String (IIRC).
A better approach would be to just use Array.insert(_:at:):
var sorted = sorted // Make a local mutable copy
sorted.insert(new, at: index)
By the way, after your for loop you need to insert at the end of your array (this also removes the need for checking the empty case):
return sorted + [new]
Since this works even when sorted is empty, you can remove that special case.
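Putting those points together, one possible rewrite of addToSorted might look like this (just a sketch; firstIndex(where:) still performs a linear scan):
func addToSorted(sorted: [Int], new: Int) -> [Int] {
    var result = sorted                                        // local mutable copy
    let index = result.firstIndex(where: { new > $0 }) ?? result.endIndex
    result.insert(new, at: index)                              // falls back to appending at the end
    return result
}

print(addToSorted(sorted: [10, 7, 2], new: 5)) // [10, 7, 5, 2]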
Since you know your data structure is already sorted, you can use binary search instead of linear search to find the insertion index faster.

Swift : Performing operations on certain elements of an array

So, something is bugging me with the syntax in Swift for performing operations on Arrays of Ints.
What I want to do is this: I have an array of Ints which is output from a function; its size (count) varies between, say, 2 and 6 for now, depending on buttons I press in my app.
For each output array that contains n Ints, I want to create n new arrays on which to perform another action later on.
These "sub" arrays are supposed to be calculated this way:
newArray1's values should be array's values minus the value at the first index of array
newArray2's values should be array's values minus the value at the second index of array
etc. (I'll automate the number of newArrays according to array.count)
Another condition for those new arrays is that if the value at a given index is negative, I add 12 (so it'll occur for newArray2 at index 1, for newArray3 at indexes 1 & 2, etc., as long as those newArrays are created).
Here's how I wanted to do that (I created this with a dummy array in the playground for the sake of testing before inserting the correct stuff in my app code):
var array : [Int] = [2,4,6,8,9]
var newArray2 = [Int]()
var increment2 = Int()
increment2 = array[1]
newArray2 = array.map {$0 - increment2}
for i in 0..<newArray2.count {
    if array[i] < 0 {
        newArray2[i] = array[i] + 12
    } else {
        newArray2[i] = array[i]
    }
}
print(array)
print(newArray2)
So of course it doesn't work, because I can't seem to figure out how to correctly perform these operations on arrays...
Intuitively it seems that in my first if statement I'm comparing not the element at index i but i itself; not sure how to reformulate that though...
Any help is most welcome, thanks in advance ! :)
[EDIT: I just edited the names of newArray1 to newArray2, same for increments, so that I have negative values and it matches the index value of 1 which is the second element of my main array]
You seem to mean this:
let arr = [2,4,6,8,9]
var results = [[Int]]()
for i in arr.indices {
    results.append(arr.map {
        var diff = $0 - arr[i]
        if diff < 0 { diff += 12 }
        return diff
    })
}
// results is now:
// [[0, 2, 4, 6, 7],
// [10, 0, 2, 4, 5],
// [8, 10, 0, 2, 3],
// [6, 8, 10, 0, 1],
// [5, 7, 9, 11, 0]]
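The same computation can also be written as a nested map, if that reads better to you (equivalent to the loop above, just under a different name so it can sit next to it in a playground):
let resultsViaMap = arr.map { pivot in
    arr.map { value -> Int in
        var diff = value - pivot
        if diff < 0 { diff += 12 }
        return diff
    }
}
// resultsViaMap == results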

How to find a random index in array A whose value does not appear in array B?

Let's say array A holds this:
[0, 1, 8, 3, 10, 6, 2]
And array B holds this:
[1, 2]
How can I generate a random index into array A whose value does not appear in array B? Possible indexes in the above example are:
0, 2, 3, 4, 5
But how to do this in Swift?
When you want to work with Array elements and their indices, enumerated() can be a good tool:
var a = [0, 1, 8, 3, 10, 6, 2]
var b = [1, 2]
var possibleIndices = a.enumerated()
.filter{!b.contains($0.element)}
.map{$0.offset}
print(possibleIndices)
//->[0, 2, 3, 4, 5]
(When b can be large, better make it a Set.)
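For example, the Set-based variant could be a drop-in replacement for the snippet above (same result, just O(1) lookups, assuming the elements are Hashable):
let excluded = Set(b)
let possibleIndices = a.enumerated()
    .filter { !excluded.contains($0.element) }
    .map { $0.offset }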
And then:
(When we can assume b never holds all contents of a.)
var randomIndexToPossibleIndices = Int(arc4random_uniform(UInt32(possibleIndices.count)))
var randomIndex = possibleIndices[randomIndexToPossibleIndices]
If the assumption above cannot be satisfied, possibleIndices can be empty. So you'd better make randomIndex Optional:
var randomIndex: Int? = nil
if !possibleIndices.isEmpty {
    let randomIndexToPossibleIndices = Int(arc4random_uniform(UInt32(possibleIndices.count)))
    randomIndex = possibleIndices[randomIndexToPossibleIndices]
}
Thanks to Martin R.
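On Swift 4.2 and later you can also skip arc4random_uniform entirely, because randomElement() already returns an optional that is nil for an empty collection:
let randomIndex = possibleIndices.randomElement() // Int?, nil if possibleIndices is empty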
First, you'd have to generate a diff between the two arrays (unless they're both extremely large, in which case randomly trying recursively might give better performance).
Then all you have to do is find a random index you'd like to use and access said element:
#if os(Linux)
    let j = Int(random() % count)
#else
    let j = Int(arc4random()) % count
#endif
will give you a valid random index (where count is the number of elements left after the diff).
If you then use this index to pick the element, and find that element in your original array, you'll have your result.
If your elements are integers, collisions can occur; in that case the thing I'd do would be to find it recursively. Remember that this can result in slow performance.
Look into the functional programming part of collections in Swift here:
Swift Guide to map filter reduce
For instance you could use filter in the following way (and I don't know if this is the best way):
collection.filter {
    var found = false
    for element in bCollection {
        if element == $0 {
            found = true
        }
    }
    return !found // Might be better to turn the true/false logic around above to slightly improve performance.
}
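That inner loop is really just a containment check, so the same filter can be written more directly (collection and bCollection are the placeholder names from above; a Set makes the lookup O(1), assuming the elements are Hashable):
let bSet = Set(bCollection)
let candidates = collection.filter { !bSet.contains($0) }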
How about working with sets?
let a = [0, 1, 8, 3, 10, 6, 2]
let b = [1, 2]
var setA = Set(a)
var setB = Set(b)
setA.subtract(setB)
var index: Int? = nil
if let first = setA.first {
index = a.index(of: first)
}
// if index == nil no such index exists
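If you also need the random part, the set difference can be combined with randomElement() on Swift 4.2+ (a sketch):
let allowed = Set(a).subtracting(b)
let randomIndex = a.indices.filter { allowed.contains(a[$0]) }.randomElement()
// randomIndex is nil when every value of a also appears in b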

Reduce array to tuple of first and last element?

I have an array that I would like to first sort, then return the first and last element of the sorted array. I thought I can use reduce, but what if I don't have an initial value?
Here is the array I'm trying to work with:
let myNumbers = [4, 9, 6, 2, 3]
How can I map this to the first and last elements of the sorted array, like this?
(2, 9)
Method 1: min()/max()
This is the easiest way:
let input = [4, 9, 6, 2, 3]
let output = (input.min(), input.max())
print(output) //(Optional(2), Optional(9))
If you're certain that the array isn't empty, you can safely force unwrap the optionals:
let input = [4, 9, 6, 2, 3]
let output = (input.min()!, input.max()!) // (2, 9)
This approach does two iterations over the array, so it's O(N). Unless a sorted list is required elsewhere, sorting and then taking the first/last would be worse, as it would be O(N * log N).
Method 2: reduce()
If you insist on using reduce, you can do it like this:
let input = [4, 9, 6, 2, 3]
let output = input.reduce((min: Int.max, max: Int.min)){
(min($0.min, $1), max($0.max , $1))
} //(2, 9)
Each reduce iteration sets the accumulator to the new min (the smaller of the old min and current element), and the new max (the larger of the old max and the current).
The initial values of the accumulator are set such that:
Any element in the array compares as smaller than the accumulator's min
Any element in the array compares as larger than the accumulator's max
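An equivalent reduce(into:) version, for what it's worth, mutates the accumulator in place instead of building a new tuple on every step:
let output = input.reduce(into: (min: Int.max, max: Int.min)) { acc, value in
    acc.min = Swift.min(acc.min, value)
    acc.max = Swift.max(acc.max, value)
}
print(output) // (min: 2, max: 9)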
You don't need an initialValue for reduce, it's optional.
var foo = [1, 40, 20, -20, 50];
var reducer = function(prev, curr, i, arr){return [prev[0] <= curr ? prev[0] : curr, prev[1] >= curr ? prev[1] : curr]};
var baz = foo.reduce(reducer); // [-20, 50]
Or maybe like this:
var foo = [1, 40, 20, -20, 50];
var reducer = function(prev, curr, i, arr){return {min: prev.min <= curr ? prev.min : curr, max: prev.max >= curr ? prev.max : curr}};
var baz = foo.reduce(reducer); // {min: -20, max: 50}
Edit: I just noticed this is for Swift and not JavaScript, whoops lol. I must have been surfing the wrong SO category. I think the principle would be the same in Swift, except you probably do need to provide some kind of initial value.
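For reference, the closest Swift translation does need a seed; using the first element as the seed mirrors what JavaScript's reduce does when the initial value is omitted (so the array must not be empty):
let foo = [1, 40, 20, -20, 50]
let baz = foo.dropFirst().reduce((foo[0], foo[0])) { acc, curr in
    (Swift.min(acc.0, curr), Swift.max(acc.1, curr))
}
// baz == (-20, 50)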

How do I fetch the i'th element from a Swift ArraySlice?

Below I am trying to fetch the i'th element of the ArraySlice draggignFan. The code builds fine (no warnings) but the program dies at runtime on the line where I try to index the slice like a normal array:
var draggingFan : ArraySlice<Card>?
...
if let draggingFan = draggingFan {
    for i in 1 ..< draggingFan.count {
        let card = draggingFan[i] // EXECUTION ERROR HERE
        ...
    }
}
According to the docs there is a first and last method (which I use elsewhere with no problem). So how do I index an ArraySlice in Swift? (Note: I am intentionally skipping the 0'th index in the slice -- that's needed elsewhere).
The indices of the ArraySlice still match those of the original array. In your case, you are accessing index 1 which is not in your slice. If you offset the index by draggingFan.startIndex it will work:
if let draggingFan = draggingFan {
    for i in 1 ..< draggingFan.count {
        let card = draggingFan[draggingFan.startIndex + i]
        ...
    }
}
Alternatively:
if let draggingFan = draggingFan {
    for i in draggingFan.startIndex + 1 ..< draggingFan.endIndex {
        let card = draggingFan[i]
        ...
    }
}
This will access the values from the second element in the slice to the last element in the slice:
let original = [1,2,3,4,5,6] // Int array to demonstrate
var draggingFan : ArraySlice<Int>?
draggingFan = original[1...4] // create the slice
if let draggingFan = draggingFan {
    // so there's no errors just slice the slice and iterate over it
    for i in draggingFan[(draggingFan.startIndex+1)..<draggingFan.endIndex] {
        print(i, terminator: ", ")
    }
}
Output:
3, 4, 5,
The reason you are having this problem is that the slice maintains the original index numbers of the sequence you got it from. Thus, element 1 is not in this slice.
For example, consider this code:
let arr = [1,2,3,4,5,6,7,8,9]
let slice = arr[2...5]
Now what is slice[1]? It isn't 4, even though that is the second thing in the slice. It's 2, because the slice still points into the original array. In other words, slice[1] is out of the slice's range! That is why you're getting a runtime error.
What to do? Well, the actual indexes of the slice are its indices. That is what you want to cycle thru. But... You don't want the first element pointed to by the slice. So you need to advance the startIndex of the range you're going to iterate through. Thus:
if let draggingFan = draggingFan {
    let ixs = draggingFan.indices.dropFirst() // the slice's own indices, minus the first one
    for i in ixs {
        // ... now your code will work ...
    }
}
However, in my view, there's no need to index the slice at all, and you shouldn't be doing so. You should cycle through the slice itself, not thru its indexes. You have this:
for i in 1 ..< draggingFan.count
But that is much like saying
for aCard in draggingFan
...except that you want to drop the first element of the slice. Then drop it! Say this:
for aCard in draggingFan.dropFirst()
To see that this will work, try this in a playground:
let arr = [1,2,3,4,5,6,7,8,9]
let slice = arr[2...5]
for anInt in slice.dropFirst() {
print(anInt) // 4, 5, 6
}
As you can see, we are cycling through exactly the desired elements, with no reference to index numbers at all.
To iterate over the elements in the slice:
draggingFan?.forEach({ (element) in
    ...
})
As far as I know, to get a specific element it needs to be converted back to an array, e.g.
let draggingFanArray = Array(draggingFan!)
Here's the playground code I used to toy around with various scenarios:
import Cocoa
var a: Array<Int>?
var b: ArraySlice<Int>?
a = [1, 2, 3, 4, 5, 6, 7]
b = a![3...5]
let count = b!.count
b!.forEach({ (element) in
print("\(element)")
})
let c = Array(b!)
print(c[2])
Edit: you could also wrap this in an ArraySlice extension, though:
extension ArraySlice {
    // Zero-based access that hides the slice's non-zero startIndex.
    func elementAtIndex(index: Int) -> Element {
        return Array(self)[index]
    }
}
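And a quick check with the playground values above (b is the slice a![3...5], i.e. [4, 5, 6]):
print(b!.elementAtIndex(index: 2)) // 6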
If I have an array:
var arr = [1, 2, 3, 4, 5, 6, 7] // [1, 2, 3, 4, 5, 6, 7]
And I take a slice of the array:
let slice = arr[3..<arr.count] // [4, 5, 6, 7]
This slice will have a startIndex of 3, which means that indexing starts at 3 and ends at 6.
Now if I want a slice containing everything but the first element, I can use the dropFirst() method:
let sliceMinusFirst = slice.dropFirst() // [5, 6, 7]
And at this point, sliceMinusFirst has a startIndex of 4, which means my indexes range from 4 to 6.
Now if I wish to iterate over these to do something with the items, I can do the following:
for item in sliceMinusFirst {
print(item)
}
Alternatively, I can do it with forEach:
sliceMinusFirst.forEach { item in
print(item)
}
By using these forms of iteration, the fact that the startIndex is nonzero doesn't even matter, because I don't use the indices directly. And it also doesn't matter that, after taking a slice, I wanted to drop the first item. I was able to do that easily. I could have even done that at the time I wanted to do the iteration:
slice.dropFirst().forEach { item in
print(item)
}
Here I dropped the first item from the original slice, without creating any intermediate variables.
Remember that if you need to actually use the index, you're probably doing something wrong. And if you genuinely do need the index, make sure you understand what's going on.
Also if you want to get back to zero-based indexing once you make a slice, you can create an array from your slice:
let sliceArray = Array(slice) // [4, 5, 6, 7]
sliceArray.startIndex // 0
