Scala case class arguments instantiation from array

Consider a case class with a possibly large number of members; to illustrate the case assume two arguments, as in
case class C(s1: String, s2: String)
and therefore assume an array with size of at least that many arguments,
val a = Array("a1", "a2")
Then
scala> C(a(0), a(1))
res9: C = C(a1,a2)
However, is there an approach to case class instantiation that avoids referring to each element of the array individually, for any (possibly large) number of predefined class members?

No, you can't. You cannot guarantee your array size is at least the number of members of your case class.
You can use tuples though.
Suppose you have the case class mentioned above and a tuple that looks like this:
val t = ("a1", "a2")
Then you can do:
c.tupled(t)
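For completeness, a minimal self-contained sketch of this approach (assuming Scala 2, where the synthetic companion object of a case class extends FunctionN):
case class C(s1: String, s2: String)
val t = ("a1", "a2")

// The companion object of C extends Function2, so tupled is available on it:
val c1 = C.tupled(t)            // C(a1,a2)

// Equivalent, and also works when the companion is written by hand:
val c2 = (C.apply _).tupled(t)  // C(a1,a2)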

Having gathered bits and pieces from the other answers, a solution using Shapeless 2.0.0 is as follows:
import shapeless._
import HList._
import syntax.std.traversable._
val a = List("a1", 2) // List[Any]
val aa = a.toHList[String::Int::HNil]
val aaa = aa.get.tupled // (String, Int)
Then we can instantiate a given case class with
case class C(val s1: String, val i2: Int)
val ins = C.tupled(aaa)
and so
scala> ins.s1
res10: String = a1
scala> ins.i2
res11: Int = 2
The type argument passed to toHList has to be known at compile time, just like the types of the case class members it is mapped onto.
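Alternatively (a sketch under the same Shapeless 2.x assumption), Generic can build the case class directly from the HList, skipping the tuple step:
import shapeless._
import syntax.std.traversable._

case class C(s1: String, i2: Int)

val a: List[Any] = List("a1", 2)

// Generic[C].from turns a String :: Int :: HNil directly into a C
val maybeC: Option[C] = a.toHList[String :: Int :: HNil].map(Generic[C].from)
// maybeC: Option[C] = Some(C(a1,2))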

To convert a Seq to a tuple see this answer: https://stackoverflow.com/a/14727987/2483228
Once you have a tuple, serejja's answer will get you to a c.
Note that convention would have us spell c with a capital C.

Related

Intuitively explain why `List` is covariant but `Array` is invariant?

From List[+T] I understand a list of dogs is also a list of animals which aligns perfectly with the intuition. From def :: [B >: A](elem: B): List[B] I understand I can add an animal (B, less specific) to a list of dogs (A, more specific) and will get back a list of animals. This aligns with the intuition as well. So basically List is good.
From Array[T] I understand an array of dogs is not (could not be used in place of) an array of animals, which is rather counterintuitive. An array of dogs is indeed an array of animals as well, but obviously Scala disagrees.
I was hoping someone could intuitively explain why Array is invariant, preferably in terms of dogs (or cats).
There is Why are Arrays invariant, but Lists covariant? but I'm looking for a more intuitive explanation that doesn't (heavily) involve the type system.
Related to Why is Scala's immutable Set not covariant in its type?
The reason is pretty simple. It is because Array is a mutable collection. Remember, there is a very easy rule of thumb about variance.
If it produces something it can be covariant.
If it consumes something it can be contravariant.
That is why Functions are contravariant on input and covariant on output.
Because Arrays are mutable they are in fact both producers and consumers of something, so they have to be invariant.
Let me show why it has to be like that with a simple example.
// Assume this compiles, it doesn't.
final class CovariantArray[+A](arr: Array[A]) {
  def length: Int = arr.length
  def apply(i: Int): A = arr(i)
  def update(i: Int, a: A): Unit = {
    arr(i) = a
  }
}
sealed trait Pet
final case class Dog(name: String) extends Pet
final case class Cat(name: String) extends Pet
val myDogs: CovariantArray[Dog] = new CovariantArray(Array(Dog("Luna"), Dog("Lucas")))
val myPets: CovariantArray[Pet] = myDogs // Valid due to covariance.
val myCat: Cat = Cat("Milton")
myPets(1) = myCat // Valid because of Liskov substitution.
val myDog: Dog = myDogs(1) // Runtime error: a Cat is not a Dog.
You can reproduce this error in Java using normal Arrays, Scala will simply not let you compile.
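For comparison, here is a minimal sketch of how the same runtime failure can be provoked from Scala by forcing the cast that Java arrays perform implicitly (the JVM reifies array element types, so the bad store fails at runtime):
sealed trait Pet
final case class Dog(name: String) extends Pet
final case class Cat(name: String) extends Pet

val myDogs: Array[Dog] = Array(Dog("Luna"), Dog("Lucas"))

// Array is invariant in Scala, so the unsound "covariant" view needs an explicit cast
val myPets: Array[Pet] = myDogs.asInstanceOf[Array[Pet]]

myPets(1) = Cat("Milton") // throws java.lang.ArrayStoreException at runtime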

Why do we have functions named componentN in Kotlin

I've just looked at the Kotlin standard library and found some strange extension functions called componentN, where N is an index from 1 to 5.
There are such functions for all primitive array types. For example:
/**
 * Returns 1st *element* from the collection.
 */
@kotlin.internal.InlineOnly
public inline operator fun IntArray.component1(): Int {
    return get(0)
}
It looks curious to me. I'm interested in the developers' motives. Is it better to call array.component1() instead of array[0]?
Kotlin has many functions that enable particular features by convention. You can identify those by the use of the operator keyword. Examples are delegates, operator overloading, the index operator and also destructuring declarations.
The componentX functions allow destructuring to be used on a particular class. You have to provide these functions in order to be able to destructure instances of that class into its components. It's good to know that data classes provide them for each of their properties by default.
Take a data class Person:
data class Person(val name: String, val age: Int)
It will provide a componentX function for each property so that you can destructure it like here:
val p = Person("Paul", 43)
println("First component: ${p.component1()} and second component: ${p.component2()}")
val (n,a) = p
println("Descructured: $n and $a")
//First component: Paul and second component: 43
//Descructured: Paul and 43
Also see this answer I gave in another thread:
https://stackoverflow.com/a/46207340/8073652
These are Destructuring Declarations and they're very convenient in certain cases.
val arr = arrayOf(1, 2, 3)
val (a1, a2, a3) = arr
print("$a1 $a2 $a3") // >> 1 2 3
val (a1, a2, a3) = arr
is compiled down to
val a1 = arr.component1()
val a2 = arr.component2()
val a3 = arr.component3()

Convert case class constructor parameters to String Array in Scala

I have a case class as follows:
case class MHealthUser(acc_Chest_X: Double, acc_Chest_Y: Double, acc_Chest_Z: Double, activityLabel: Int)
These form the schema of a Spark DataFrame, which is why I'm using a case class. I simply want to map these to an Array[String] so I can use the ParamValidators.inArray(attributes) method in Spark. I use the following code to map the constructor parameters to an array using reflection:
val attributes: Array[String] = MHealthUser.getClass.getConstructors.map(a => a.toString)
but this simply gives me an array of length 1, whereas I want an array of length 4 whose contents are the field names of the schema I've defined, as strings. Otherwise I'm stuck hard-coding the values of the dataset schema, which is obviously inelegant.
In other words I want the output:
val attributes: Array[String] = Array("acc_Chest_X", "acc_Chest_Y", "acc_Chest_Z", "activityLabel")
I've been playing with this for a while and can't get it to work. Any ideas appreciated. Thanks!
I'd use ScalaReflection:
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.StructType
ScalaReflection.schemaFor[MHealthUser].dataType match {
  case s: StructType => s.fieldNames
  case _ => Array[String]()
}
Outside Spark, see Scala. Get field names list from case class
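Outside Spark, a minimal sketch along the lines of the linked answer, using plain Scala 2 runtime reflection (scala-reflect); the helper name caseClassFieldNames is just for illustration:
import scala.reflect.runtime.universe._

// Collect the case accessor names of a case class, in declaration order
def caseClassFieldNames[T: TypeTag]: Array[String] =
  typeOf[T].members.sorted.collect {
    case m: MethodSymbol if m.isCaseAccessor => m.name.toString
  }.toArray

val attributes: Array[String] = caseClassFieldNames[MHealthUser]
// Array(acc_Chest_X, acc_Chest_Y, acc_Chest_Z, activityLabel)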

Scala Array Slicing with Tuple

I'm trying to slice a 1D Array[Double] using the slice method. I've written a method that returns the start and end index as a tuple (Int, Int).
def getSliceRange(): (Int, Int) = {
  val start = ...
  val end = ...
  (start, end)
}
How can I use the return value of getSliceRange directly?
I tried:
myArray.slice.tupled(getSliceRange())
But this gives me a compile error:
Error:(162, 13) missing arguments for method slice in trait IndexedSeqOptimized;
follow this method with `_' if you want to treat it as a partially applied function
myArray.slice.tupled(getSliceRange())
I think the problem is the implicit conversion from Array to ArrayOps (which gets slice from GenTraversableLike): slice is a method, so it has to be eta-expanded into a function value before tupled becomes available. Any of the following works:
val doubleArray = Array(1d, 2, 3, 4)
(doubleArray.slice(_, _)).tupled
Function.tupled[Int, Int, Array[Double]](doubleArray.slice)
(doubleArray.slice: (Int, Int) => Array[Double]).tupled
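Putting it together with the method from the question (a small sketch; getSliceRange is stubbed here just for illustration):
val myArray = Array(1.0, 2.0, 3.0, 4.0, 5.0)

def getSliceRange(): (Int, Int) = (1, 4) // stand-in for the real computation

// Eta-expand slice to a function value, then tuple it
val sliceAt: ((Int, Int)) => Array[Double] = (myArray.slice _).tupled

sliceAt(getSliceRange()) // Array(2.0, 3.0, 4.0)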
Two options here, the first one is to call your function twice:
myArray.slice(getSliceRange()._1, getSliceRange()._2)
or to save your Tuple beforehand:
val myTuple: (Int, Int) = getSliceRange()
myArray.slice(myTuple._1, myTuple._2)
Edit: I leave this here just in case but Peter Neyens posted the expected answer.

incredible implicit Array conversion in scala

According to Scaladoc, there is no method named map in the Array class, but there is an implicit function implicit def intArrayOps(xs: Array[Int]): ArrayOps[Int] defined in scala.Predef. So you can apply map to Array(1,2,3,4) if you like. But what I am confused about is that the map result is of type Array[Int], not ArrayOps[Int]. Here is my test:
scala> val array = Array(1,2,3,4)
array: Array[Int] = Array(1, 2, 3, 4)
scala> array.map(x => x)
res18: Array[Int] = Array(1, 2, 3, 4)
scala> res18.isInstanceOf[Array[Int]]
res19: Boolean = true
scala> res18.isInstanceOf[scala.collection.mutable.ArrayOps[Int]]
warning: there were 1 unchecked warnings; re-run with -unchecked for details
res20: Boolean = false
It indeed returns an array, as intended and as is convenient; there is no reason you would need an ArrayOps, which is intended only to provide extra methods for arrays. The doc is wrong.
The routine is actually not implemented in ArrayOps. As with most collection methods, it is inherited from TraversableLike. And you see two map methods in the doc:
def map [B] (f: (T) ⇒ B): ArrayOps[B]
def map [B, That] (f: (T) ⇒ B)(implicit bf: CanBuildFrom[Array[T], B, That]): That
Only the second one exists (inherited from TraversableLike). It is intended to allow implementing map in just one place (TraversableLike) while always giving the best possible behavior. For instance, a String is a Seq[Char]: if you map with a function from character to character, you get a String, but if you map from Char to, say, Int, the result cannot be a String and will just be a Seq. This is explained in much detail in the paper Fighting Bit Rot with Types.
However, this makes for a very complex signature, which does not reflect the simplicity of using the method, and makes for very poor documentation most of the time (you would normally have to chase down which CanBuildFrom in implicit scope would apply). This was discussed in the most famous Scala question on Stack Overflow. So the scaladoc tool was extended so that a simpler entry, corresponding to the intended usage, may appear. If you look at the source of GenTraversableLike, where the routine is introduced, you will see the following in the scaladoc for map (and a similar one in many methods):
@usecase def map[B](f: A => B): $Coll[B]
Subtypes add in their doc @define Coll <className>, and map (among others) appears with the simplified signature, marked [Use case]. In the source of ArrayOps, there is a @define Coll ArrayOps where it should be Array.
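A quick illustration of the String example above (Scala 2.12-era collections; 2.13 reaches the same results through a reworked mechanism):
"abc".map(_.toUpper) // String = ABC                          (Char => Char keeps the String)
"abc".map(_.toInt)   // IndexedSeq[Int] = Vector(97, 98, 99)  (cannot be a String)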
You can use the REPL with the -Xprint:typer option to see what's going on. Here is the output of the map method, reformatted for easier reading:
$ scala -Xprint:typer
scala> Array(1,2,3,4).map(x => x)
[[syntax trees at end of typer]]// Scala source: <console>
// some lines deleted
private[this] val res0: Array[Int] =
  scala.this.Predef.intArrayOps(scala.Array.apply(1, 2, 3, 4))
    .map[Int, Array[Int]]
    (( (x: Int) => x ))
    (scala.this.Array.canBuildFrom[Int](reflect.this.Manifest.Int));
So, simplifying away the package names, here is what happens:
intArrayOps(Array(1,2,3,4))  // converts to ArrayOps
  .map[Int, Array[Int]]      // calls map with the parameter lists below
  ((x: Int) => x)            // pass the identity function as first param
  (Array.canBuildFrom[Int](  // pass a builder for Array[Int] as second param
    Manifest.Int))           // pass the class manifest for Int
So there is indeed a conversion to ArrayOps (first line). It returns ArrayOps[Int].
The ArrayOps.map[Int, Array[Int]] method is then called on it. Then, as didierd explains, the original signature for map - not the simplified signature - indicates that the inferred return type will be Array[Int].
