Let's say I want to iterate over an array in a way that isn't well-supported by the Js/Belt standard library functions. For example, perhaps I need to examine pairs of elements at a time. With a list, this is straightforward to do in a recursive style:
let rec findDouble = (list) => switch list {
| list{a, b, ..._} when a == b => a
| list{_, b, ...rest} => findDouble(list{b, ...rest})
| _ => 0
}
list{7, 9, 10, 10, 11, 13} |> findDouble |> Js.log // 10
However, ReScript seems to gently discourage lists in favor of arrays (see: the clumsier list syntax and the absence of list equivalents of certain standard library functions like Belt.Map.fromArray), so I'm not sure whether converting an array to a list just to use this style is idiomatic, especially if the function produces a list that must then be turned back into an array.
Of course I can use mutability to implement the function in a traditional imperative way:
let findDouble = (arr) => {
let idx = ref(1)
let answer = ref(0)
while (idx.contents < Js.Array.length(arr)) && (answer.contents == 0) {
if arr[idx.contents] == arr[idx.contents - 1] {
answer := arr[idx.contents]
}
idx := idx.contents + 1
}
answer.contents
}
[7, 9, 10, 10, 11, 13] |> findDouble |> Js.log // 10
But this is ugly and runs counter to the functional bones of ReScript.
What is a clean, idiomatic way to implement this function?
You can still use recursion; just increment an index instead of recursing on the tail of a list:
let findDouble = arr => {
let rec loop = idx =>
if idx >= Array.length(arr) {
0
} else if arr[idx] == arr[idx - 1] {
arr[idx]
} else {
loop(idx + 1)
}
loop(1)
}
I'd like a function runningSum on an array of numbers a (or any ordered collection of addable things) that returns an array of the same length, where each element i is the sum of all elements in a up to and including i.
Examples:
runningSum([1,1,1,1,1,1]) -> [1,2,3,4,5,6]
runningSum([2,2,2,2,2,2]) -> [2,4,6,8,10,12]
runningSum([1,0,1,0,1,0]) -> [1,1,2,2,3,3]
runningSum([0,1,0,1,0,1]) -> [0,1,1,2,2,3]
I can do this with a for loop, or whatever. Is there a more functional option? It's a little like a reduce, except that it builds a result array that has all the intermediate values.
Even more general would be to have a function that takes any sequence and provides a sequence that's the running total of the input sequence.
The general combinator you're looking for is often called scan, and can be defined (like all higher-order functions on lists) in terms of reduce:
extension Array {
func scan<T>(initial: T, _ f: (T, Element) -> T) -> [T] {
return self.reduce([initial], combine: { (listSoFar: [T], next: Element) -> [T] in
// because we seeded it with a non-empty
// list, it's easy to prove inductively
// that this unwrapping can't fail
let lastElement = listSoFar.last!
return listSoFar + [f(lastElement, next)]
})
}
}
(But I would suggest that that's not a very good implementation: appending with + copies the accumulator array on every step, so it's quadratic in the input length.)
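For reference, here is a rough linear-time sketch of my own (not part of the original answer) that threads a single mutable accumulator and pre-sizes the result, so no intermediate arrays are copied. The name and call shape mirror the version above, so the usage below applies unchanged.

extension Array {
    // Sketch: one mutable accumulator and one pre-sized result array,
    // so the work is linear rather than quadratic in the input length.
    func scan<T>(_ initial: T, _ f: (T, Element) -> T) -> [T] {
        var result = [initial]
        result.reserveCapacity(count + 1)
        var accumulator = initial
        for element in self {
            accumulator = f(accumulator, element)
            result.append(accumulator)
        }
        return result
    }
}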
This is a very useful general function, and it's a shame that it's not included in the standard library.
You can then generate your cumulative sum by specializing the starting value and operation:
let cumSum = els.scan(0, +)
And you can drop the leading seed value rather simply:
let cumSumTail = els.scan(0, +).dropFirst()
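For a concrete check (my own example values): the scan result includes the seed, and dropFirst() removes it.

let els = [1, 2, 3, 4]
els.scan(0, +)             // [0, 1, 3, 6, 10]
els.scan(0, +).dropFirst() // [1, 3, 6, 10]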
Swift 4
The general sequence case
Citing the OP:
Even more general would be to have a function that takes any sequence
and provides a sequence that's the running total of the input
sequence.
Consider some arbitrary sequence (conforming to Sequence), say
var seq = 1... // 1, 2, 3, ... (CountablePartialRangeFrom)
To create another sequence which is the (lazy) running sum over seq, you can make use of the global sequence(state:next:) function:
var runningSumSequence =
sequence(state: (sum: 0, it: seq.makeIterator())) { state -> Int? in
if let val = state.it.next() {
defer { state.sum += val }
return val + state.sum
}
else { return nil }
}
// Consume and print accumulated values less than 100
while let accumulatedSum = runningSumSequence.next(),
accumulatedSum < 100 { print(accumulatedSum) }
// 1 3 6 10 15 21 28 36 45 55 66 78 91
// Consume and print next
print(runningSumSequence.next() ?? -1) // 120
// ...
If we'd like (for the joy of it), we could condense the closure to sequence(state:next:) above somewhat:
var runningSumSequence =
sequence(state: (sum: 0, it: AnyIterator(seq.makeIterator()))) {
(state: inout (sum: Int, it: AnyIterator<Int>)) -> Int? in
state.it.next().map { (state.sum + $0, state.sum += $0).0 }
}
However, type inference tends to break (still some open bugs, perhaps?) for these single-line returns of sequence(state:next:), forcing us to explicitly specify the type of state; hence the gritty explicit parameter clause in the closure above.
Alternatively: custom sequence accumulator
protocol Accumulatable {
static func +(lhs: Self, rhs: Self) -> Self
}
extension Int : Accumulatable {}
struct AccumulateSequence<T: Sequence>: Sequence, IteratorProtocol
where T.Element: Accumulatable {
var iterator: T.Iterator
var accumulatedValue: T.Element?
init(_ sequence: T) {
self.iterator = sequence.makeIterator()
}
mutating func next() -> T.Element? {
if let val = iterator.next() {
if accumulatedValue == nil {
accumulatedValue = val
}
else { defer { accumulatedValue = accumulatedValue! + val } }
return accumulatedValue
}
return nil
}
}
var accumulator = AccumulateSequence(1...)
// Consume and print accumulated values less than 100
while let accumulatedSum = accumulator.next(),
accumulatedSum < 100 { print(accumulatedSum) }
// 1 3 6 10 15 21 28 36 45 55 66 78 91
The specific array case: using reduce(into:_:)
As of Swift 4, we can use reduce(into:_:) to accumulate the running sum into an array.
let runningSum = arr
.reduce(into: []) { $0.append(($0.last ?? 0) + $1) }
// [2, 4, 6, 8, 10, 12]
By using reduce(into:_:), the [Int] accumulator will not be copied in subsequent reduce iterations; citing the Language reference:
This method is preferred over reduce(_:_:) for efficiency when the
result is a copy-on-write type, for example an Array or a
Dictionary.
See also the implementation of reduce(into:_:), noting that the accumulator is provided as an inout parameter to the supplied closure.
However, each iteration will still result in an append(_:) call on the accumulator array; amortized O(1) averaged over many invocations, but still an arguably unnecessary overhead here as we know the final size of the accumulator.
Because arrays increase their allocated capacity using an exponential
strategy, appending a single element to an array is an O(1) operation
when averaged over many calls to the append(_:) method. When an array
has additional capacity and is not sharing its storage with another
instance, appending an element is O(1). When an array needs to
reallocate storage before appending or its storage is shared with
another copy, appending is O(n), where n is the length of the array.
Thus, knowing the final size of the accumulator, we could explicitly reserve that capacity for it using reserveCapacity(_:) (as is done e.g. in the native implementation of map(_:)):
let runningSum = arr
.reduce(into: [Int]()) { (sums, element) in
if let sum = sums.last {
sums.append(sum + element)
}
else {
sums.reserveCapacity(arr.count)
sums.append(element)
}
} // [2, 4, 6, 8, 10, 12]
For the joy of it, condensed:
let runningSum = arr
.reduce(into: []) {
$0.append(($0.last ?? ($0.reserveCapacity(arr.count), 0).1) + $1)
} // [2, 4, 6, 8, 10, 12]
Swift 3: Using enumerated() for subsequent calls to reduce
Another Swift 3 alternative (with an overhead ...) is using enumerated().map in combination with reduce within each element mapping:
func runningSum(_ arr: [Int]) -> [Int] {
return arr.enumerated().map { arr.prefix($0).reduce($1, +) }
} /* thanks @Hamish for improvement! */
let arr = [2, 2, 2, 2, 2, 2]
print(runningSum(arr)) // [2, 4, 6, 8, 10, 12]
The upside is you won't have to use an array as the collector in a single reduce (instead repeatedly calling reduce).
Just for fun: The running sum as a one-liner:
let arr = [1, 2, 3, 4]
let rs = arr.map({ () -> (Int) -> Int in var s = 0; return { (s += $0, s).1 } }())
print(rs) // [1, 3, 6, 10]
It does the same as the (updated) code in JAL's answer; in particular, no intermediate arrays are generated.
The sum variable is captured in an immediately-evaluated closure returning the transformation.
If you just want it to work for Int, you can use this:
func runningSum(array: [Int]) -> [Int] {
return array.reduce([], combine: { (sums, element) in
return sums + [element + (sums.last ?? 0)]
})
}
If you want it to be generic over the element type, you have to do a lot of extra work, declaring the various number types to conform to a custom protocol that provides a zero element and (if you want it generic over both floating-point and integer types) an addition operation, because Swift doesn't provide that already. (A future version of Swift may fix this problem.)
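As it happens, later Swift versions did largely address this. Here is a sketch of mine (not part of the original answer), assuming Swift 5's AdditiveArithmetic protocol, which supplies both a zero value and + for integer and floating-point types alike, so no custom protocol is needed:

func runningSum<T: AdditiveArithmetic>(_ array: [T]) -> [T] {
    // Accumulate into a pre-sized array; works for Int, Double, etc.
    var sums: [T] = []
    sums.reserveCapacity(array.count)
    var total = T.zero
    for element in array {
        total += element
        sums.append(total)
    }
    return sums
}

runningSum([1, 0, 1, 0, 1, 0]) // [1, 1, 2, 2, 3, 3]
runningSum([1.5, 2.5, 4.0])    // [1.5, 4.0, 8.0]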
Assuming an array of Ints, it sounds like you can use map to manipulate the input:
let arr = [0,1,0,1,0,1]
var sum = 0
let val = arr.map { (sum += $0, sum).1 }
print(val) // "[0, 1, 1, 2, 2, 3]\n"
I'll keep working on a solution that doesn't use an external variable.
I thought it'd be cool to extend Sequence with a generic scan function, as is suggested in the great first answer.
Given this extension, you can get the running sum of an array like this: [1,2,3].scan(0, +)
But you can also get other interesting things…
Running product: array.scan(1, *)
Running max: array.scan(Int.min, max)
Running min: array.scan(Int.max, min)
Because the implementation is a function on Sequence and returns a Sequence, you can chain it together with other sequence functions. It is efficient, having linear running time.
Here's the extension…
extension Sequence {
func scan<Result>(_ initialResult: Result, _ nextPartialResult: @escaping (Result, Self.Element) -> Result) -> ScanSequence<Self, Result> {
return ScanSequence(initialResult: initialResult, underlying: self, combine: nextPartialResult)
}
}
struct ScanSequence<Underlying: Sequence, Result>: Sequence {
let initialResult: Result
let underlying: Underlying
let combine: (Result, Underlying.Element) -> Result
typealias Iterator = ScanIterator<Underlying.Iterator, Result>
func makeIterator() -> Iterator {
return ScanIterator(previousResult: initialResult, underlying: underlying.makeIterator(), combine: combine)
}
var underestimatedCount: Int {
return underlying.underestimatedCount
}
}
struct ScanIterator<Underlying: IteratorProtocol, Result>: IteratorProtocol {
var previousResult: Result
var underlying: Underlying
let combine: (Result, Underlying.Element) -> Result
mutating func next() -> Result? {
guard let nextUnderlying = underlying.next() else {
return nil
}
previousResult = combine(previousResult, nextUnderlying)
return previousResult
}
}
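A quick usage check of my own, assuming the extension compiles as written; since scan returns a sequence, wrap the result in Array to materialize it:

let sums = Array([1, 2, 3, 4].scan(0, +))
print(sums)    // [1, 3, 6, 10]

let maxima = Array([3, 1, 4, 1, 5].scan(Int.min, max))
print(maxima)  // [3, 3, 4, 4, 5]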
One solution using reduce:
func runningSum(array: [Int]) -> [Int] {
return array.reduce([], combine: { (result: [Int], item: Int) -> [Int] in
if result.isEmpty {
return [item] //first item, just take the value
}
// otherwise take the previous value and append the new item
return result + [result.last! + item]
})
}
I'm very late to this party. The other answers have good explanations. But none of them have provided the initial result, in a generic way. This implementation is useful to me.
public extension Sequence {
/// A sequence of the partial results that `reduce` would employ.
func scan<Result>(
_ initialResult: Result,
_ nextPartialResult: @escaping (Result, Element) -> Result
) -> AnySequence<Result> {
var iterator = makeIterator()
return .init(
sequence(first: initialResult) { partialResult in
iterator.next().map {
nextPartialResult(partialResult, $0)
}
}
)
}
}
extension Sequence where Element: AdditiveArithmetic & ExpressibleByIntegerLiteral {
var runningSum: AnySequence<Element> { scan(0, +).dropFirst() }
}
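A quick usage sketch of my own (values assumed): because the result is a lazy AnySequence, it also works on unbounded sequences when combined with prefix.

print(Array([1, 1, 2, 2].runningSum))     // [1, 2, 4, 6]
print(Array((1...).runningSum.prefix(5))) // [1, 3, 6, 10, 15]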
I want to write three nested for loops with promises, something like this:
for i = 1 .. 3
for j = 1 .. 5
for k = 1 .. 6
post call to db to check if there is an item at location i, j, k
step 1: check 1,1,1
step 2: check 1,1,2; either the item is found or we move on to the next index
...
I searched for similar questions but only found examples that loop over one-dimensional arrays.
I find this problem very hard and I hope you can help me. Thanks.
Chaining promises can be done linearly. So in your case, you first have to generate an array which holds all the possible combinations, like so:
const locations = [
[1, 1, 1], [1, 1, 2], [1, 1, 3], [1, 1, 4],
....
];
Then you can easily loop over this array and chain your promises. Hope this helps.
You can't combine a synchronous for loop with asynchronous operations and get things to sequence properly because there's no way to make a for loop "wait" for a promise to finish. The for loop runs synchronously so it will just start all the async operations at once.
So, instead you have to do your iterations another way. If you were iterating over just one parameter, there would be a number of ready-made ways to do that, but I'm not aware of any pre-built solutions for iterating three nested variables, so you will have to build your own. Here's one way to do it. This method is custom-coded for your iterations, which makes it a bit less code than a general scheme:
// fn gets called like this fn(i, j, k) and must return a promise
function iterateLevels(fn) {
var i = 1, iMax = 3;
var j = 1, jMax = 5;
var k = 1, kMax = 6;
function next() {
if (k > kMax) {
j++;
k = 1;
}
if (j > jMax) {
i++;
j = 1;
}
if (i > iMax) {
return;
}
return fn(i, j, k).then(function(result) {
k++;
// process result here
// if you want to continue processing, then
return next();
});
}
return next();
}
Here's a working demo implementation using a promise with a random delay: https://jsfiddle.net/jfriend00/q2Lnhszt/
The next() function with the outer scope variables i, j and k is essentially a state machine where each time you call next() it runs the next iteration and updates its state.
This could be made generic so that you pass in the number of iteration levels, the start and stop values for each level, and a function for processing each result. Making it generic adds more code to the implementation and makes the logic a little harder to follow.
When do we use the each closure and when do we use the for loop in Groovy? Both effectively do the same thing.
groovy:000> a = [1, 2, 3, 4]
===> [1, 2, 3, 4]
groovy:000> a.each {
groovy:001> println it
groovy:002> }
1
2
3
4
===> [1, 2, 3, 4]
groovy:000> for (it in a) {
groovy:001> println it
groovy:002> }
1
2
3
4
===> null
The each closure returns the list. So I can probably use it instead of the for loop but not vice versa, correct?
The for loop is probably somewhat faster and more memory efficient, since the each construct is syntactic sugar. But I can't imagine a situation where it would really make a difference.
By my lights, the only significant difference is you can't terminate an each closure without throwing an exception. You can terminate a for loop with a break.
I do think the each construct reads a bit more idiomatic.
The for loop is faster, as you can see in this question and this answer, but remember that premature optimization is the root of all evil in programming, so I can't find any scenario where it really makes a difference.
each also returns the whole, unmodified collection that it was operating on, and here is a particular scenario where the OP found that useful. The for construct doesn't return anything.
It is generally Groovier to prefer each, because changing the code will be easier in the future. The each method is related to a family of functional methods that are ubiquitous in Groovy; by contrast, the for loop is mired in the imperative style of Java.
As a starting example, imagine that we want foo to call bar for each item:
def bar = { println it }
def foo = { def a ->
a.each { bar it }
}
foo( [1,2,3,4] )
Now, what if we want to call bar only for even numbers:
def bar = { println it }
def foo = { def a ->
a.findAll{ (it % 2) == 0 }.each{ bar it }
}
foo( [1,2,3,4] )
What if we discover that bar needs the list index of the item?
def bar = { def item, def index -> println "${index} ${item}" }
def foo = { def a ->
a.eachWithIndex{ def x, def i -> bar x, i }
}
foo( [1,2,3,4] )
Try writing these examples with the for loop. Sure, it can be done, but the changes seem inelegant. (Note, as mentioned in the comments, you'll want to learn collect and other methods; for vs. each is somewhat of a false dichotomy.)