Swift Array() coercion ambiguous without more context, but only in an extension

It is legal to say this (arr is an Array):
let arrenum = Array(arr.enumerated())
So why isn't it legal to say this?
extension Array {
    func f() {
        let arrenum = Array(self.enumerated())
        // error: type of expression is ambiguous without more context
    }
}
EDIT: It seems this is a workaround:
extension Array {
    func f() {
        typealias Tup = (offset: Index, element: Element)
        let arrenum = Array<Tup>(self.enumerated())
    }
}
But why is that needed? (And is it right?)

This is a known bug (SR-1789). Swift currently has a feature where you can refer to a generic type within its own body without having to repeat its placeholder type(s) – the compiler will infer them for you to be the same as the type of self.
For example:
struct S<T> {
    func foo(_ other: S) { // parameter inferred to be `S<T>`.
        let x = S() // `x` inferred to be `S<T>`.
    }
}
extension S {
    func bar(_ other: S) {} // same in extensions too.
}
This is pretty convenient, but the bug you're running into is the fact that Swift will always make this inference, even if it's incorrect.
So, in your example:
extension Array {
    func f() {
        let arrenum = Array(self.enumerated())
        // error: type of expression is ambiguous without more context
    }
}
Swift interprets the code as let arrenum = Array<Element>(self.enumerated()), as you're in the body of Array<Element>. This is incorrect, because enumerated() yields a sequence of offset-element tuple pairs – Swift should have inferred Array to be Array<(offset: Int, element: Element)> instead.
One workaround, which you've already discovered, is to explicitly specify the placeholder type in order to prevent the compiler from making this incorrect inference.
extension Array {
    func f() {
        let arrenum = Array<(offset: Int, element: Element)>(self.enumerated())
    }
}
Another possible workaround appears to be using the fully-qualified type, for example:
extension Array {
    func f() {
        let arrenum = Swift.Array(self.enumerated())
    }
}
as it appears Swift doesn't do the same inference for fully-qualified types (I'm not sure if you should rely on this fact though).
Finally, it's worth noting that instead of calling Array's initialiser, you could use map(_:) to avoid the issue entirely:
extension Array {
    func f() {
        let arrenum = self.enumerated().map { $0 }
    }
}
which, like the initialiser call, will give you back an array of offset-element pairs.
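For illustration (the array contents here are invented), a top-level call shows the shape of those pairs, which all of the workarounds above reproduce inside the extension:

let arr = [10, 20, 30]
let pairs = Array(arr.enumerated()) // inference works fine at top level
print(pairs)
// [(offset: 0, element: 10), (offset: 1, element: 20), (offset: 2, element: 30)]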

Related

Type inference in Swift

I'm trying to add a mutating method through an Array extension. I'm creating a 2D array to do some calculations, but strangely Xcode throws the error below while creating the 2D array:
error: cannot convert value of type '[Int]' to expected argument type 'Int'
var bucket : [[Int]] = Array.init(repeating: [Int](), count: base)
My playground code is this:
extension Array where Element == Int {
    public mutating func someTestMethod() {
        let base = 10
        var bucket : [[Int]] = Array.init(repeating: [Int](), count: base)
        // Some Other Code
    }
}
Whereas the code below works fine:
extension Array where Element == Int {
    public mutating func someTestMethod() {
        let base = 10
        var bucket : [[Int]] = .init(repeating: [Int](), count: base)
        // Some Other Code
    }
}
I would like to know why this is happening, since type inference should work in both cases. I would appreciate any help understanding it.
When you omit the type and write only .init
var bucket : [[Int]] = .init(repeating: [Int](), count: base)
the compiler deduces which init to call from the contextual type you have given, var bucket : [[Int]], so the complete init call is
Array<[Int]>.init(repeating: [Int](), count: base)
but if you use Array.init, the compiler instead uses the actual type of the extension, which the where clause constrains to Array<Int>.
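If you do want to keep the type name at the call site, here is a minimal sketch of a third spelling, using the same explicit-placeholder trick as the first answer above:

extension Array where Element == Int {
    public mutating func someTestMethod() {
        let base = 10
        // Naming the placeholder type explicitly overrides the inference
        // of Array<Int> that the surrounding extension would trigger.
        var bucket = Array<[Int]>(repeating: [Int](), count: base)
        // Some Other Code
    }
}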

Extending Array with generic floating point math type

I'd like to achieve the following but can't fix the error: "Non-nominal type 'Element' does not support explicit initialization"
Original attempt:
public extension Array where Element: FloatingPointMathType {
    func mean<Element>() -> Element {
        let sum: Element = reduce(0.0, +) as! Element
        return sum / Element(count) // ==> Non-nominal type 'Element' does not support explicit initialization
    }
}
Also, I wonder why it requires the as! Element cast.
As a comparison, the following local function compiles with no issue:
func mean<Element: FloatingPointMathType>(_ e: [Element]) -> Element {
    let sum: Element = e.reduce(0.0, +)
    return sum / Element(e.count)
}
It's impossible to say what the problem is exactly, because we don't know how your FloatingPointMathType protocol is defined. There are a few issues in your implementation. Chiefly, you don't want to define a generic function mean<Element>: the extension is already parameterized over the Element type, and the generic parameter introduces a new type name that shadows it (and that new type is unconstrained, so you can't do anything with it).
The following works with the standard library's FloatingPoint protocol:
public extension Collection where Element: FloatingPoint {
    func mean() -> Element {
        reduce(into: 0, +=) / Element(count)
    }
}
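As a quick usage sketch (the sample values are invented for illustration):

let samples: [Double] = [1, 2, 3, 4]
print(samples.mean()) // 2.5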

How to pass an array to a Go function without defining the size of the array?

I tried to define an array and pass it to a function whose parameter doesn't specify a size, but I get an error.
package main

import "fmt"

func main() {
    var a = [5]int{1, 2, 3, 4, 5}
    f(a, 5)
    fmt.Println(a)
}

func f(arr []int, size int) {
    for i, x := range arr {
        fmt.Println(i, x)
        arr[i] = 100
    }
}
cannot use a (type [5]int) as type []int in argument to f
You can convert the array to a slice inline, like so:
f(a[:], 5)
Playground
For more background see: https://blog.golang.org/go-slices-usage-and-internals

Swift Array: Cannot invoke 'append' with an argument list of type '(Int)'

I'm trying to implement a padding function for Array:
extension Array {
    func dataWithPadding(offset: Int, length: Int, paddingLength: Int) -> NSData {
        var arr = Array(self[offset..<(offset + length)])
        arr = arr.reverse()
        for (var i = 0; i < paddingLength; i++) {
            arr.append(0)
        }
        let d = NSData(bytesNoCopy: &arr, length: length)
        return d
    }
}
This errors at arr.append with:
Cannot invoke 'append' with an argument list of type '(Int)'
I tried changing the declaration to:
var arr[UInt8] = Array(self[offset..<(offset + length)])
However, this also errors:
Cannot assign to immutable expression of type '[UInt8].Type' (aka 'Array<UInt8>.Type')
The weird part: if I run the original code with arr.append commented out and use lldb to run arr.append(0) directly, it actually works.
I'm using Xcode 7.2.
Since your Array extension puts no constraint on which element types it can be used with, .append(0) cannot be invoked: not all types can be initialized from an integer literal. Hence it's not weird that you cannot use .append(0) in the "generic" array extension, whereas you naturally can use it directly on an array whose elements Swift can infer to be integer-literal convertible, e.g. [Int]. Consider the following example:
var arr : [UInt8] = []
arr.append(0)
var arr2 : [String] = []
arr2.append(0) // error
In the example above, both arrays have access to your extension method dataWithPadding, but neither can use arr.append(0) inside the extension itself, hence the error message
Cannot invoke 'append' with an argument list of type '(Int)'
Now, a simple fix is to add a type constraint on the array elements to IntegerLiteralConvertible, after which your extension is valid and accessible for all arrays whose elements conform to IntegerLiteralConvertible.
extension Array where Element: IntegerLiteralConvertible {
    func dataWithPadding(offset: Int, length: Int, paddingLength: Int) -> NSData {
        var arr = Array(self[offset..<(offset + length)])
        arr = arr.reverse()
        for (var i = 0; i < paddingLength; i++) {
            arr.append(0)
        }
        let d = NSData(bytesNoCopy: &arr, length: length)
        return d
    }
}
Alternatively, make use of the less general SignedNumberType, UnsignedIntegerType or IntegerType as the type constraint for Element; these also conform to e.g. Comparable and Equatable, in case you'd like to compare and perform operations on your generic elements in the extension.
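For instance, a minimal sketch with IntegerType as the constraint (Swift 2 era protocol names; the allNonNegative() helper is hypothetical, just to show that comparison now works on Element):

extension Array where Element: IntegerType {
    // IntegerType brings Comparable and IntegerLiteralConvertible along,
    // so elements can be compared against a literal inside the extension.
    func allNonNegative() -> Bool {
        return !self.contains { $0 < 0 }
    }
}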
Finally, note that you can naturally use your own custom protocol as the type constraint for Element in your extension, allowing you to include additional blueprints (below, the foo() method) accessible to the Element types in the extension:
protocol MyIntegerLiteralInitializableTypes: IntegerLiteralConvertible {
    func foo()
}

extension MyIntegerLiteralInitializableTypes {
    func foo() {
        print("I am of type \(self.dynamicType)!")
    }
}

/* Extend each element type whose arrays should have access
   to .dataWithPadding with MyIntegerLiteralInitializableTypes */
extension Int8 : MyIntegerLiteralInitializableTypes { }
extension UInt8 : MyIntegerLiteralInitializableTypes { }

extension Array where Element: MyIntegerLiteralInitializableTypes {
    func dataWithPadding(offset: Int, length: Int, paddingLength: Int) -> NSData {
        self.first?.foo() /* I am of type ...! */
        var arr = Array(self[offset..<(offset + length)])
        arr = arr.reverse()
        for (var i = 0; i < paddingLength; i++) {
            arr.append(0)
        }
        let d = NSData(bytesNoCopy: &arr, length: length)
        return d
    }
}
It's also probably a good idea to add @warn_unused_result to your dataWithPadding(...) function signature, as a call to it that discards the return value will yield a runtime exception ("... malloc: ...: pointer being freed was not allocated").

Why does the same method fail when inside an Array extension in Swift?

I'm getting strange behavior when trying to call sort() from within an Array extension, e.g. this method:
func test() {
    let a = [1,2,3]
    sort(a) { x, y in x < y }
}
Works on its own, but fails when it's inside an Array extension:
extension Array {
    func test() {
        let a = [1,2,3]
        sort(a) { x, y in x < y }
    }
}
It's a build error, failing with:
Extra argument in call
Curiously, the same method works inside a String extension:
extension String {
    func test() {
        let a = [1,2,3]
        sort(a) { x, y in x < y }
    }
}
Why can't I call sort() from within an Array extension?
Because Array has a sort method of its own, which is of the form sort(isOrderedBefore: (T, T) -> Bool).
So, when you call sort inside the scope of Array, you are actually referring to that version instead of the global sort function.
Thanks to this answer in a question of mine, I found that you can make sure you're calling the sort version in the global scope by using Swift's default namespace, Swift. So, the global version is accessible via Swift.sort.
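The same member-shadows-global lookup rule still applies in modern Swift. As a sketch (the smaller(_:_:) helper is hypothetical), the global min(_:_:) needs the same qualification inside an Array extension, because the min() and min(by:) members shadow it:

extension Array where Element: Comparable {
    func smaller(_ a: Element, _ b: Element) -> Element {
        // An unqualified min(a, b) would resolve against Array's own
        // min() / min(by:) members and fail to type-check;
        // Swift.min selects the global function instead.
        return Swift.min(a, b)
    }
}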
