How to write func for the generic parameter in golang - arrays

I am trying to write a function Map so that it can handle slices of any element type.
package main

import "fmt"

// Interface to specify a generic slice ("array") type.
type Iterable interface {
}

func main() {
    list_1 := []int{1, 2, 3, 4}
    list_2 := []uint8{'a', 'b', 'c', 'd'}
    Map(list_1)
    Map(list_2)
}

// This function is meant to print every element of
// any slice type.
func Map(list Iterable) {
    for _, value := range list {
        fmt.Print(value)
    }
}
But it throws a compile-time error:
19: cannot range over list (type Iterable)
The error is correct, because range requires an array, pointer to an array, slice, string, map, or channel permitting receive operations, and here the type is Iterable. I think the problem I am facing is converting the argument of type Iterable back to a concrete slice type. Please suggest how I could write my function to handle slices generically.

As Rob Pike mentions in this thread
Is it possible to express "any map", "any array" or "any slice" in a Go type switch?
No. The static types must be exact.
The empty interface is really a type, not a wildcard.
You could only iterate over a list of a specific type, such as a slice of an interface type with known methods.
You can see an example with "Can we write a generic array/slice deduplication in go?"
Even using reflection, passing a slice as an interface{} would be, as this thread shows, error-prone (see this example).
Update Nov. 2021, 7 years later: CL 363434, for Go 1.18 (Q1 2022) actually introduces functions useful with slices of any type, using generics.
// Package slices defines various functions useful with slices of any type.
// Unless otherwise specified, these functions all apply to the elements
// of a slice at index 0 <= i < len(s).
package slices

import "golang.org/x/exp/constraints"

// Equal reports whether two slices are equal: the same length and all
// elements equal. If the lengths are different, Equal returns false.
// Otherwise, the elements are compared in index order, and the
// comparison stops at the first unequal pair.
// Floating point NaNs are not considered equal.
func Equal[T comparable](s1, s2 []T) bool {
    if len(s1) != len(s2) {
        return false
    }
    for i, v1 := range s1 {
        v2 := s2[i]
        if v1 != v2 {
            return false
        }
    }
    return true
}
Note that issue 50792 and CL 382834 show that:
We left constraints behind in the standard library because we believed it was fundamental to using generics, but in practice that hasn't proven to be the case.
In particular, most code uses any or comparable.
If those are the only common constraints, maybe we don't need the package.
Or if constraints.Ordered is the only other commonly used constraint, maybe that should be a predeclared identifier next to any and comparable.
Hence import "golang.org/x/exp/constraints" instead of import "constraints".
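With those type parameters, the Map from the question can be written directly, with no empty-interface Iterable and no reflection. A minimal sketch (not part of the slices package above):

package main

import "fmt"

// Map prints every element of a slice of any element type.
// Note: fmt.Print on a uint8 prints its numeric value, just as in the question's original code.
func Map[T any](list []T) {
    for _, value := range list {
        fmt.Print(value)
    }
}

func main() {
    Map([]int{1, 2, 3, 4})
    Map([]uint8{'a', 'b', 'c', 'd'})
}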

Your definition of Map is somewhat incomplete. The usual way to declare it would include a mapper function.
Your example can be implemented at least this way:
package main

import "fmt"

// Interface to specify something that can be mapped.
type Mapable interface {
}

func main() {
    list_1 := []int{1, 2, 3, 4}
    list_2 := []string{"a", "b", "c", "d"}
    Map(print, list_1)
    Map(print, list_2)
}

func print(value Mapable) {
    fmt.Print(value)
}

// Map applies mapper to every value passed in the variadic list.
func Map(mapper func(Mapable), list ...Mapable) {
    for _, value := range list {
        mapper(value)
    }
}
It works, though it must be said it is a bit untyped, because no, Go does not have 'generics' in the Hindley-Milner sense.
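With Go 1.18 type parameters (see the update above), a mapper-style Map can also be written without the empty interface. A minimal sketch, not taken from any library:

// Map applies f to every element of xs and returns a new slice of the results.
func Map[T, U any](xs []T, f func(T) U) []U {
    out := make([]U, 0, len(xs))
    for _, x := range xs {
        out = append(out, f(x))
    }
    return out
}

For example, Map([]int{1, 2, 3, 4}, func(v int) int { return v * 2 }) returns []int{2, 4, 6, 8}.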

Related

Is Swift array reversed()[n] efficient or not?

When you call reversed() on an array in Swift, you get a ReversedCollection which merely wraps the original array with reversed access. Thus this is extremely efficient:
let arr = [1,2,3,4]
for i in arr.reversed() { print(i) }
Nothing actually got reversed except the access; the time complexity of reversed here is O(1). Cool!
But when I index into reversed() by an integer and check Quick Help, it appears I've lost all that efficiency; I'm shown the Sequence reversed() which generates a new array:
let arr = [1,2,3,4]
let i = arr.reversed()[1] // this is a different `reversed()`!
And this seems to be true, because a reversed() array does not, itself, support indexing by number:
let arr = [1,2,3,4]
let rev = arr.reversed()
let i = rev[1] // compile error!
So my question is: is it really true that indexing by number into a reversed() array, as in my second example, loses the efficiency of the ReverseCollection index reversal?
Yes, indexing by Int is causing you to lose your O(1) access into the reversed array. Quite the gotcha!
As you note, reversed() here is an overloaded method; on Array specifically, you have two definitions to choose from:
BidirectionalCollection.reversed(), which returns a ReversedCollection, and
Sequence.reversed(), which turns any sequence into a reversed [Element]
The overloading here is most confusing for Array itself, because it's the only Sequence type such that type(of: x) == type(of: x.reversed()).
The Swift type checker prefers more specific overloads over less specific ones, so in general, the compiler will use the BidirectionalCollection overload instead of the Sequence one where possible. The rub: BidirectionalCollection has an opaque index type and cannot be indexed using an Int; when you do index into the collection with an Int, the compiler is instead forced to choose the Sequence overload over the BidirectionalCollection one. This is also why your last code sample fails to compile: Swift type inference does not take into account surrounding context on other lines; on its own, rev is preferred to be a ReversedCollection<Array<Int>>, so attempting to index into it with an Int fails.
You can see this a little more clearly with the following:
func collType1<T: Collection>(_: T) {
    print(T.self)       // ReversedCollection<Array<Int>>
    print(T.Index.self) // Index
}

func collType2<T: Collection>(_: T) where T.Index == Int {
    print(T.self)       // Array<Int>
    print(T.Index.self) // Int
}

let x: [Int] = [1, 2, 3]
collType1(x.reversed())
collType2(x.reversed())
Lest you wonder whether the compiler can optimize around this when the fact of Int-based indexing appears to not have any other side effects, at the time of writing, the answer appears to be "no". The Godbolt output is a bit too long to reproduce here, but at the moment, comparing
func foo1(_ array: [Int]) {
    if array.reversed()[100] > 42 {
        print("Wow!")
    }
}
with
func foo2(_ array: [Int]) {
    if array.reversed().dropFirst(100).first! > 42 {
        print("Wow!")
    }
}
with optimizations enabled shows foo2 performing direct array access
cmp qword ptr [rdi + 8*rax + 24], 43
having optimized away the ReversedCollection wrapper entirely, while foo1 goes through significantly more indirection.
Ferber explained the reason very well.
Here's an ad-hoc solution (which may not be preferred by everyone, because we are extending types from the standard library):
// RandomAccessCollection ensures fast index creation
extension ReversedCollection where Base: RandomAccessCollection {
    subscript(_ offset: Int) -> Element {
        let index = index(startIndex, offsetBy: offset)
        return self[index]
    }
}
[1, 2, 3].reversed()[0] // 3

How to concatenate two arrays in Go

A basic question that I'm struggling to find an answer for, as there are a lot of answers about how to join two slices using the append function and the spread operator, which erroneously use the word 'array'.
I am new to Go and have made the assumption that using sized arrays is good practice where the size is known. However, I am struggling to work with arrays as I can't figure out how to do simple operations such as concatenation. Here is some code.
var seven [7]int
five := [5]int{1, 2, 3, 4, 5}
two := [2]int{6, 7}

// this doesn't work as both the inputs and the assignment are the wrong type
seven = append(five, two)

// this doesn't work as the assignment is still the wrong type
seven = append(five[:], two[:]...)

// this works, but I'm not using arrays any more so I may as well use slices everywhere and forget sizing
seven2 := append(five[:], two[:]...)
As far as I can see I can either just give up on arrays and use slices exclusively or I could write a loop to explicitly construct the new array. Is there a third option?
append() can only be used to append elements to a slice. If you have an array, you can't pass that directly to append().
What you may do is slice the array, so you get a slice (which will use the array as its backing store), and you can use that slice as the target and source of elements.
For example:
s := seven[:0]
s = append(s, five[:]...)
s = append(s, two[:]...)
fmt.Println(seven)
This will print (try it on the Go Playground):
[1 2 3 4 5 6 7]
Also note that since append() returns the resulting slice, it's possible to write all this in one line:
_ = append(append(seven[:0], five[:]...), two[:]...)
(Storing the result is not needed here because we have and want to use only the backing array, but in general that is not the case.)
This outputs the same, try it on the Go Playground. Although this isn't very readable, so it's not worth compacting it into a single line.
Although when you have the target array, "appending" arrays is nothing more than copying them to the target, to the proper position. For that, you may use the builtin copy() function too. Note that the copy() function also accepts only slices, so you have to slice the arrays here too.
copy(seven[:], five[:])
copy(seven[len(five):], two[:])
fmt.Println(seven)
This will output the same. Try this one on the Go Playground.
You can use copy
copy(seven[:], five[:])
copy(seven[5:], two[:])
fmt.Printf("%v\n", seven)
> [1 2 3 4 5 6 7]
You can concatenate two arrays in Go using the copy function:
package main

import "fmt"

func main() {
    five := [5]int{1, 2, 3, 4, 5}
    two := [2]int{6, 7}
    var n [len(five) + len(two)]int
    copy(n[:], five[:])
    copy(n[len(five):], two[:])
    fmt.Println(n)
}
https://blog.golang.org/go-slices-usage-and-internals
The Go runtime checks whether the current index exceeds the maximum possible one.
For an array, it looks up the array's type (which contains its length and a reference to the element type), because that type is fixed at compile time.
// each array mention with a unique size creates a new type
array := [5]byte{1, 2, 3, 4, 5}
For a slice, it looks up the slice header, which looks like:
type slice struct {
    data *byte
    len  int
    cap  int // capacity, the maximum possible index
}
As you can see, a slice is a single structure with data, len, and cap fields, while an array is just the data itself (referenced here through *byte).
When you convert an array to a slice, Go just creates a slice header and fills in its fields:
slice := array[:]
==
slice := Slice{}
slice.data = array
slice.len = type_of(array).len
slice.cap = type_of(array).len
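A short runnable illustration of that last point, assuming nothing beyond the standard library: the slice produced by array[:] shares the array's storage, so its len and cap come from the array's type, and writes through the slice show up in the array.

package main

import "fmt"

func main() {
    array := [5]byte{1, 2, 3, 4, 5}
    slice := array[:] // new header: data points at the array, len = cap = 5

    fmt.Println(len(slice), cap(slice)) // 5 5

    slice[0] = 42
    fmt.Println(array[0]) // 42: the write went into the array's own storage
}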
You can do that simply by converting the arrays into slices:
arr1 := [...]int{1, 2, 3}
arr2 := [...]int{4, 5, 6}
//arr3 = arr1 + arr2 // not allowed

// converting arrays into slices
slc_arr1, slc_arr2 := arr1[:], arr2[:]
slc_arr3 := make([]int, 0)
slc_arr3 = append(slc_arr1, slc_arr2...)
fmt.Println(slc_arr3) // [1 2 3 4 5 6]
There is a more general way of appending slices of any type (once Go has generics; for now this solution is specific to strings, so just change the type as appropriate; a sketch using type parameters follows the example below). The notion of Fold comes from functional programming. Note that I have also included a filter function which also uses Fold. The solution is not stack safe, but in many cases that does not matter. It can be made stack safe with trampolining. At the end is an example of its usage.
func FoldRightStrings(as, z []string, f func(string, []string) []string) []string {
    if len(as) > 1 { // Slice has a head and a tail.
        h, t := as[0], as[1:len(as)]
        return f(h, FoldRightStrings(t, z, f))
    } else if len(as) == 1 { // Slice has a head and an empty tail.
        h := as[0]
        return f(h, FoldRightStrings([]string{}, z, f))
    }
    return z
}

func FilterStrings(as []string, p func(string) bool) []string {
    var g = func(h string, accum []string) []string {
        if p(h) {
            return append(accum, h)
        } else {
            return accum
        }
    }
    return FoldRightStrings(as, []string{}, g)
}

func AppendStrings(as1, as2 []string) []string {
    var g = func(h string, accum []string) []string {
        return append(accum, h)
    }
    return FoldRightStrings(as1, as2, g)
}

func TestAppendStringArrays(t *testing.T) {
    strings := []string{"a", "b", "c"}
    bigarray := AppendStrings(AppendStrings(strings, strings), AppendStrings(strings, strings))
    if diff := deep.Equal(bigarray, []string{"a", "b", "c", "c", "b", "a", "a", "b", "c", "c", "b", "a"}); diff != nil {
        t.Error(diff)
    }
}
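Since Go 1.18, type parameters make that generalization possible. A minimal sketch of the same right fold written once for any element type (FoldRight and Append are my names, not from a library), with the same caveat about stack safety:

// FoldRight folds a slice from the right: f(as[0], f(as[1], ... f(as[n-1], z))).
// Like FoldRightStrings above, it recurses and so is not stack safe.
func FoldRight[A, B any](as []A, z B, f func(A, B) B) B {
    if len(as) == 0 {
        return z
    }
    return f(as[0], FoldRight(as[1:], z, f))
}

// Append appends the elements of as1 onto as2, in reverse order,
// exactly as AppendStrings above does for strings.
func Append[A any](as1, as2 []A) []A {
    return FoldRight(as1, as2, func(h A, accum []A) []A {
        return append(accum, h)
    })
}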

Go: How to get length of slice in function?

I've got a function to which I want to feed different kinds of slices after which I want to loop over them and print their contents. The following code works:
func plot(data interface{}) {
    fmt.Println(data)
    //fmt.Println(len(data))
}

func main() {
    l := []int{1, 4, 3}
    plot(l)
}
But when I uncomment the line in which I print the length of the slice, I get an error saying invalid argument data (type interface {}) for len.
Any idea how I would be able to get the length of the slice so that I can loop over it?
You should try to avoid using interface{} whenever possible. What you want to do can be done with reflection, but reflection is a necessary evil. It is really good for marshalling, but should be used sparingly. If you still want to use reflect, you can do something like this:
func plot(data interface{}) {
    s := reflect.ValueOf(data)
    if s.Kind() != reflect.Slice {
        panic("plot() given a non-slice type")
    }
    for i := 0; i < s.Len(); i++ {
        v := s.Index(i)
        ...
    }
}
Even after doing this, v is a reflect.Value. You will then need to somehow convert that to something useful. Luckily, Value has many methods that can be used to convert it. In this case, v.Int() would return the value as an int64.
As hinted in the comments, you would have to use reflection to do this, something like the following:
var sliceLen int
s := reflect.ValueOf(data)
switch s.Kind() {
case reflect.Slice:
    sliceLen = s.Len()
default:
    // error here, unexpected kind
}
Although Go provides reflection to do these little tricks when you need them (as well as for many other uses), it is often better to avoid it wherever possible, to maintain compile-time type safety and performance. Consider the pros and cons of having separate functions for different data types versus this approach.
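For completeness: since Go 1.18, type parameters give a simpler answer that keeps compile-time type safety and avoids reflection entirely. A minimal sketch, adapting the question's plot:

package main

import "fmt"

// plot takes a slice of any element type, so len and range work directly.
func plot[T any](data []T) {
    fmt.Println(data)
    fmt.Println(len(data))
    for _, v := range data {
        fmt.Println(v)
    }
}

func main() {
    l := []int{1, 4, 3}
    plot(l)
}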

Implement Scan interface to read json array into map

I am fetching a JSON array from PostgreSQL, and I would like to read it into a map. I am able to Unmarshal the values into a []string slice, but what I actually want is a map[string]bool.
I've written a custom type for the column with a Scan interface that converts the JSON array into a slice of strings first, then reads each string into the custom map type as keys.
type custMap map[string]bool

func (m *custMap) Scan(src interface{}) error {
    b, ok := src.([]byte)
    if !ok {
        return errors.New("Error Scanning Array")
    }
    s := make([]string, 0)
    json.Unmarshal(b, &s)
    for _, v := range s {
        (*m)[v] = true
    }
    return nil
}

type data struct {
    vals custMap `json:"vals"`
}
The query I am trying to scan returns a row with a column vals which is a JSON array: ["some", "arr", "vals"], where the custom type is used like so:
var d models.data
sqlDB.QueryRow().Scan(&d.vals)
My expected output is a struct with the following shape
{ vals: map[string]bool{ "some": true, "arr": true, "vals": true } }
This compiles fine, but my code panics with "assignment to entry in nil map"
How can I fix my Scan function? Is it even possible to do this with a map type?
You are calling your method Scan of type *custMap on an uninitialized map. Initialize d.vals either like
d.vals = custMap{}
or
d.vals = make(custMap)
Other answers already provide an explanation.
The Go Programming Language Specification
Map types
A map is an unordered group of elements of one type, called the
element type, indexed by a set of unique keys of another type, called
the key type. The value of an uninitialized map is nil.
A new, empty map value is made using the built-in function make, which
takes the map type and an optional capacity hint as arguments:
make(map[string]int)
make(map[string]int, 100)
The initial capacity does not bound its size: maps grow to accommodate
the number of items stored in them, with the exception of nil maps. A
nil map is equivalent to an empty map except that no elements may be
added.
I don't see a make to initialize your map: "A nil map is equivalent to an empty map except that no elements may be added."
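Building on that, another option is to have Scan allocate the map itself, so callers cannot forget to initialize it. A minimal sketch of that variant of the question's Scan (same custMap type as above; the encoding/json and errors imports are assumed):

func (m *custMap) Scan(src interface{}) error {
    b, ok := src.([]byte)
    if !ok {
        return errors.New("Error Scanning Array")
    }
    var s []string
    if err := json.Unmarshal(b, &s); err != nil {
        return err
    }
    *m = make(custMap, len(s)) // allocate the map so the assignments below cannot panic
    for _, v := range s {
        (*m)[v] = true
    }
    return nil
}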

Swift: optional array count

In Objective-C, if I had the following property:
#property (strong, nonatomic) NSArray * myArray;
A method to return the number of objects in myArray would look like:
- (NSInteger)numberOfObjectsInMyArray
{
    return [self.myArray count];
}
This would return either the number of objects in the array, or 0 if myArray == nil.
The best equivalent I can think of for doing this in Swift is:
var myArray: Array<String>?

func numberOfObjectsInMyArray() -> Int
{
    return myArray != nil ? myArray!.count : 0
}
So this checks whether the optional array contains a value and, if so, unwraps the array and returns its count; otherwise it returns 0.
Is this the correct way to do this? Or is there something simpler?
Try using the nil coalescing operator.
According to the Apple Documentation:
The nil coalescing operator (a ?? b) unwraps an optional a if it contains a value, or returns a default value b if a is nil.
So your function could look like this:
func numberOfObjectsInMyArray() -> Int {
    return (myArray?.count ?? 0)
}
I agree with others that this could be a bad idea for a number of reasons (like making it look like there is an array with a count of "0" when there isn't actually an array at all) but hey, even bad ideas need an implementation.
EDIT:
So I'm adding this because two minutes after I posted this answer, I came across a reason for doing exactly what the author wants to do.
I am implementing the NSOutlineViewDataSource protocol in Swift. One of the functions required by the protocol is:
optional func outlineView(_ outlineView: NSOutlineView,
numberOfChildrenOfItem item: AnyObject?) -> Int
That function requires that you return the number of children of the item parameter. In my code, if the item has any children, they will be stored in an array, var children: [Person]?
I don't initialize that array until I actually add a child to the array.
In other words, at the time that I am providing data to the NSOutlineView, children could be nil, or it could be populated, or it could have once been populated but subsequently had all objects removed from it, in which case it won't be nil but its count will be 0. NSOutlineView doesn't care if children is nil - all it wants to know is how many rows it will need to display the item's children.
So, it makes perfect sense in this situation to return 0 if children is nil. The only reason for calling the function is to determine how many rows NSOutlineView will need. It doesn't care whether the answer is 0 because children is nil or because it is empty.
return (children?.count ?? 0) will do what I need. If children is nil it will return 0. Otherwise it will return count. Perfect!
That looks like the simpler way.
The Objective-C code is shorter only because, Objective-C being a C-based language, nil is also a form of 0.
Since Swift is strongly typed, you don't have such a shorthand. In this specific case it requires a little more effort, but in general it saves you most of the headaches caused by loose typing.
Concerning the specific case, is there a reason for making the array optional in the first place? You could just have an empty array. Something like this might work for you:
var myArray: Array<String> = []

func numberOfObjectsInMyArray() -> Int {
    return myArray.count
}
(Source for this information)
How about using optional for return value?
var myArray: Array<String>?

func numberOfObjectsInMyArray() -> Int? {
    return myArray?.count
}
I think that this way is safer.
(Source for this information)

Resources