Swift enums as index into an array

How can I do a simple index into an array using enums in Swift?
I am a C programmer trying to understand Swift. This is perplexing.
var arr: [String] = ["2.16", "4.3", "0.101"]
enum Ptr: Int {
    case first = 0
    case second = 1
    case third = 2
}
var ix = Int(Ptr.first)
print(ix)
let e = Double(arr[ix])
I would expect Ptr.first to yield the integer 0, which I could use as an index into the array arr.

It doesn't work because all the cases in your enum (first, second, third) are of type Ptr (the type of your enum), not Int. To get the 0 you need to access the case's rawValue property, which is an Int:
var ix = Ptr.first.rawValue // accessing the case's rawValue property
print(ix)
let e = Double(arr[ix])
Hope that helps!
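If you want to index with the enum case directly, you could also wrap the rawValue conversion in a convenience subscript. A minimal sketch, assuming the Ptr enum from the question (this extension is not part of the original answer):
extension Array {
    // Convenience: index an array directly with a Ptr case.
    subscript(_ position: Ptr) -> Element {
        return self[position.rawValue]
    }
}

let values = ["2.16", "4.3", "0.101"]
print(values[.second]) // "4.3"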

Related

Swift: Excluding a specific index from an array when filtering

I'd like to filter an array of numbers and use reduce on them, but I need to exclude a specific index and I can't divide. Is it possible to do this with methods that are part of Foundation in Swift?
I've tried breaking the array into two using prefix & suffix, but there are some edge cases where it blows up with an out-of-bounds exception.
while currentIndex < nums.count - 2 {
    for _ in nums {
        let prefix = nums.prefix(currentIndex)
        let suffix = nums.suffix(from: currentIndex + 1)
        if prefix.contains(0) || suffix.contains(0) {
            incrementIndex(andAppend: 0)
        }
        let product = Array(prefix + suffix).reduce(1, *)
        incrementIndex(andAppend: product)
    }
}
You can use enumerated() to convert a sequence (e.g. an array) into a sequence of tuples, each pairing an integer counter with an element:
var a = [1, 2, 3, 4, 5, 6, 7]
var c = 1
let value = a.enumerated().reduce(1) { (_, arg1) -> Int in
    let (index, element) = arg1
    c = index != 2 ? c * element : c
    return c
}
print(value) // prints 1680, i.e. excluding index 2
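Note that this reduce ignores its accumulator and leans on the captured variable c. A more direct variant of the same idea (a sketch, reusing the array a from above) filters out the unwanted offset before reducing:
let product = a.enumerated()
    .filter { $0.offset != 2 } // drop the element at index 2
    .map { $0.element }
    .reduce(1, *)
print(product) // 1680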
I'm seeking to figure out how to exclude a specific index
What about this sort of thing?
var nums2 = nums
nums2.remove(at: [currentIndex])
let whatever = nums2.reduce // ...
Where that remove(at:) overload (which takes an array of indexes, unlike the standard library's single-index remove(at:)) is defined here: https://stackoverflow.com/a/26308410/341994
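For reference, the linked answer defines an Array extension roughly along these lines (a sketch from memory, not quoted verbatim). Removing from the highest index down keeps the remaining indexes valid:
extension Array {
    mutating func remove(at indexes: [Int]) {
        // Remove back to front so earlier removals don't shift later indexes.
        for index in indexes.sorted(by: >) {
            remove(at: index)
        }
    }
}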

Accessing Arrays in Dictionaries (Swift)

I can't seem to figure out the syntax. I want to access and be able to modify individual integers within an array that's nested in a larger dictionary.
Here's an example:
var exampleDictionary = [ 1: [2,3] , 2: [4,5] , 3: [7,8] ]
print (exampleDictionary[2]!) // prints [4,5]
How can I access the first value of Key 2 (4 above)?
How can I change just the first value of Key 2?
I'm trying things like this and it's not working:
exampleDictionary[2[0]] = (exampleDictionary[2[0]] - 3) // want to change [4,5] to [1,5]
print (exampleDictionary[2[0]]) // should print "1"
exampleDictionary[2 : [0]] = (exampleDictionary[2 :[0]] - 3)
print (exampleDictionary[2 : [0]])
You should subscript the array, not its index:
exampleDictionary[2]![0] = (exampleDictionary[2]?[0] ?? 0) - 3
Safer would be to use optional binding, like so:
if var array = exampleDictionary[2] {
    array[0] -= 3
    exampleDictionary[2] = array
}
You always have to consider that Swift collection types are value types, so you need to modify the value in place, something like:
if let firstValueOfKey2 = exampleDictionary[2]?.first {
    print(firstValueOfKey2)
    exampleDictionary[2]![0] = firstValueOfKey2 - 3 // .first is get-only, so write through the subscript
    print(exampleDictionary[2]!.first!)
}
First of all, you have a dictionary of type [Int: [Int]], which means every key has an array of Int as its value.
1. If exampleDictionary's type isn't declared explicitly, specify it as [Int: [Int]] so that you won't need to cast in the next step.
var exampleDictionary: [Int: [Int]] = [1: [2, 3], 2: [4, 5], 3: [7, 8]]
2. Access the key you want.
var key2: [Int] = exampleDictionary[2]!
var firstValueOfKey2: Int = key2.first!
3. Change the value for key2:
key2 = [1, 5]
4. Because Swift collections are value types, key2 is a copy of the stored array, so modifying it does not change the dictionary. You have to assign the result back to the root object:
exampleDictionary[2] = key2
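Worth noting: in current Swift you can also mutate the nested array in place through optional chaining, avoiding the copy-and-reassign dance entirely (the assignment is a no-op if the key is missing). A minimal sketch:
var exampleDictionary = [1: [2, 3], 2: [4, 5], 3: [7, 8]]
exampleDictionary[2]?[0] -= 3     // writes back into the stored array
print(exampleDictionary[2] ?? []) // [1, 5]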

How to set array starting and end indices in Swift?

I tried to create an array of the elements 1-100 in a playground, but when I try to print it, it doesn't print the values in the array.
code:
var ab: Array = [1...100]
for i in ab {
    print(i)
}
output:
But the playground didn't show any error. Did I do anything wrong?
Thanks
You create an array of ClosedRange<Int> elements (a single one, 1...100):
var ab: Array = [1...100] // element type inferred from 1...100 to be ClosedRange<Int>
print(type(of: ab))       // Array<ClosedRange<Int>>
But I assume you're attempting to create an array of 100 Int elements:
var ab = Array(1...100) // Array with Int elements, initialized from the range 1...100
for i in ab {
    print(i)
} // 1 2 3 ... 100
If you're only looking to print the numbers in the range 1...100, you needn't necessarily create an array of integers to do so (or an array at all). Instead, you could use a single range value and loop over the elements it contains. E.g.
let myRange = 1...5 // inferred as type ClosedRange<Int>
for i in myRange {
    print(i) // the type of 'i' is Int, the same as in the array case above
} // 1 2 3 4 5
Clear and effective:
var ab = Array(1...100)
for i in ab {
    print(i)
}
Output:
1 2 3 ... 100

How to declare the array type in Swift

So in a swift playground file, I am trying to execute the following closure:
var list = [5, 4, 3]
var Arraymultiplier = { (list: Array) -> Array in
    for value in list {
        value * 2
        return list
    }
}
Arraymultiplier(list)
When I do this though, I get an error saying that the generic type Array must be referenced in <..> brackets, but when I put the brackets, I get another error.
What's the right way to declare the array type as an input and a return?
Array is a generic type, meaning that it expects the actual type of the array's members to be specified within < > immediately following Array:
var arrayMultiplier = { (list: Array<Int>) -> Array<Int> in
    // do whatever you want here
}
Or, in the case of arrays, there is a convenient syntax that uses the [ ] characters and omits the need for Array reference at all:
var arrayMultiplier = { (list: [Int]) -> [Int] in
    // do whatever you want here
}
I'm not sure what the original code sample was trying to do, but it looks like you might have been trying to build a new array with each item multiplied by 2. If so, then you might do something like:
let inputList = [5, 4, 3]
let arrayMultiplier = { (list: [Int]) -> [Int] in
    var results = [Int]()         // instantiate a new blank array of `Int`
    for value in list {           // iterate through the array passed to us
        results.append(value * 2) // multiply each `value` by `2` and append it to `results`
    }
    return results                // return results
}
let outputList = arrayMultiplier(inputList)
Or, obviously, you could efficiently achieve similar behavior using the map function:
let outputList = inputList.map { $0 * 2 }

What is the immutable version to de/reference array?

How can I de/reference the three array variables in this code instead of using mutable values?
The code below computes the longest common subsequence (LCS) by traversing the m*n array diagonally.
The arguments are two char arrays, shown below. The LCS method should return a length of 4, since the longest common subsequences here are "ACBB" and "BCBB".
let private s1 = "ABCDBB".ToCharArray()
let private s2 = "CBACBAABA".ToCharArray()
let public lcs_seq_1d_diags (x: char[]) (y: char[]) =
    let m = x.Length
    let n = y.Length
    let mutable dk2 = Array.create (1+m) 0
    //printfn "\r\n0: %A" dk2
    let mutable dk1 = Array.create (1+m) 0
    //printfn "1: %A" dk1
    let mutable dk = Array.create (1+m) 0
    for k = 2 to m+n do
        let low = max 1 (k-m)
        let high = min (k-1) n
        for j = low to high do
            let i = k - j
            if x.[i-1] = y.[j-1] then
                dk.[i] <- dk2.[i-1] + 1
            else
                dk.[i] <- max dk1.[i] dk1.[i-1]
        let temp = dk2
        dk2 <- dk1
        dk1 <- dk
        dk <- temp
    dk1.[m]
let private res_seq_1d_diags = duration (fun () -> lcs_seq_1d_diags s1 s2)
//res_seq_1d_diags = 4
Take a look at reference cells: http://msdn.microsoft.com/en-us/library/dd233186.aspx
The syntax looks like this:
let a = ref 1 // declaring a reference
a := 2 // changing the reference value
printfn "%i" !a // dereferencing
This might also be interesting: F#: let mutable vs. ref
Arrays are mutable by default. Try using a list instead if you want immutability.
Try starting with this instead:
let s1 = List.ofSeq "ABCDBB"
let s2 = List.ofSeq "CBACBAABA"
The rest I leave as an exercise for the reader :-)
