Save previous bar values when using a loop in Pine Script - arrays

In Pine Script I'm calling a function that sums the previous bar's value with an increment:
myFunction(myVar1) =>
    var int myVar2 = 0
    myVar2 := myVar1 + nz(myVar2[1], 1)
The increment value is added using a loop that calls the function and the result is stored in an array:
myArray = array.new_int(0)
var int myVar1 = 1
myVar1 := 1
while myVar1 <= 3
    array.push(myArray, myFunction(myVar1))
    myVar1 += 1
The result in the first bar was as expected. Since there is no previous bar, the previous value is replaced by 1 via nz(myVar2[1], 1):
plot(array.get(myArray, 0))
plot(array.get(myArray, 1))
plot(array.get(myArray, 2))
Result: [2, 3, 4]
But in the second bar:
Result: [5, 6, 7]
My expected result: [3, 5, 7]
Since the loop runs on the first bar and then runs again on the second bar, myVar2[1] picks up the value 4 saved by the last loop iteration of the first bar.
How can the previous bar's values be stored correctly when using a loop, so that the expected results are achieved?
First bar: [2, 3, 4]
Second bar: [3, 5, 7]
Third bar: [4, 7, 10]

Answer to your comment: You could save the current array in another array. That way, you always have access to the array values of the previous bar.
//@version=5
indicator("My Script", overlay=false)
var int myVar1 = na
var int[] myArray = array.new_int(3)   // Current array
var int[] prevArray = array.new_int(3) // Previous array
myFunction(myVar1) =>
    var int myVar2 = 0
    myVar2 := myVar1 + nz(myVar2[1], 1)
myVar1 := 1
prevArray := array.copy(myArray) // Save current array
array.clear(myArray)             // Clear current array
while myVar1 <= 3
    array.push(myArray, myFunction(myVar1))
    myVar1 += 1
// Show previous array
plot(array.get(prevArray, 0), 'prevArray[0]')
plot(array.get(prevArray, 1), 'prevArray[1]')
plot(array.get(prevArray, 2), 'prevArray[2]')
// Show current array
plot(array.get(myArray, 0), 'myArray[0]')
plot(array.get(myArray, 1), 'myArray[1]')
plot(array.get(myArray, 2), 'myArray[2]')

Related

Trying to get the previous and next item of an array using index

What I'm trying to do is to create a new array from an input array and iterate through it. Each item of the new array is the result of multiplying the previous and next items of the iteration.
For example:
Array input: [1, 2, 3, 4, 5, 6]
Array_final: [2, 3, 8, 15, 24, 30]
first item: 2*1 (because there's no previous item)
second item: 3*1
third item: 4*2
fourth item: 5*3
fifth item: 6*4
sixth item: 6*5 (we use the current item because we don't have a next one)
This is my code and I don't understand why I keep getting array_final = [0, 0, 0, 0, 0, 0]
class Arrays
  def self.multply(array)
    array_final = []
    last_index = array.length - 1
    array.each_with_index do |num, i|
      if i == 0
        array_final.push(num[i+1])
      elsif i == last_index
        array_final.push(num*num[i-1])
      else
        array_final.push(num[i+1]*num[i-1])
      end
    end
    return array_final
  end
end
You're using num as an array, when it's an element.
I think you meant:
array.each_with_index do |num, i|
  if i == 0
    array_final.push(array[i+1])
  elsif i == last_index
    array_final.push(num*array[i-1])
  else
    array_final.push(array[i+1]*array[i-1])
  end
end
You can use each_cons to get sequential items:
final = [input[0] * input[1]]
input.each_cons(3) do |precedent, _current, subsequent|
  final << precedent * subsequent
end
final << input[-1] * input[-2]

Swift : Performing operations on certain elements of an array

So, something is bugging me with the syntax in Swift for performing operations on arrays of Ints.
What I wanna do is this: I have an array of Ints which is output from a function; its size (count) varies between, say, 2 and 6 for now, depending on buttons I press in my app.
For each array that is output and that contains n Ints, I want to create n arrays on which to perform another action later on.
These "sub" arrays are supposed to be calculated this way:
newArray1's values should be the array's values minus the value at the first index
newArray2's values should be the array's values minus the value at the second index
etc. (I'll automate the number of newArrays according to array.count)
Another condition applies to those new arrays: if the value at a given index is negative, I add 12 (so it'll occur for newArray2 at index 1, for newArray3 at indexes 1 & 2, etc., for as long as those newArrays are created).
Here's how I wanted to do that (I created this with a dummy arbitrary array in the playground for the sake of testing before inserting the correct stuff into my app code):
var array : [Int] = [2,4,6,8,9]
var newArray2 = [Int]()
var increment2 = Int()
increment2 = array[1]
newArray2 = array.map {$0 - increment2}
for i in 0..<newArray2.count {
    if array[i] < 0 {
        newArray2[i] = array[i] + 12
    } else {
        newArray2[i] = array[i]
    }
}
print(array)
print(newArray2)
So of course it doesn't work, because I can't seem to figure out how to correctly perform operations on arrays...
Intuitively, it seems that in my first if statement I'm comparing not the element at index i but i itself; not sure how to reformat that, though...
Any help is most welcome, thanks in advance! :)
[EDIT: I just edited the names of newArray1 to newArray2, same for increments, so that I have negative values and it matches the index value of 1 which is the second element of my main array]
You seem to mean this:
let arr = [2,4,6,8,9]
var results = [[Int]]()
for i in arr.indices {
    results.append(arr.map {
        var diff = $0 - arr[i]
        if diff < 0 { diff += 12 }
        return diff
    })
}
// results is now:
// [[0, 2, 4, 6, 7],
//  [10, 0, 2, 4, 5],
//  [8, 10, 0, 2, 3],
//  [6, 8, 10, 0, 1],
//  [5, 7, 9, 11, 0]]

Swift define datatype of array

I need to somehow cast a two-dimensional array...
console:
solution.swift:22:23: error: missing argument label 'arr:' in call
array = invertArr(arr)
^
arr:
solution.swift:53:12: error: cannot convert return expression of type '[[Any]]' to return type '[[Int]]'
return result
^~~~~~
as! [[Int]]
and that's the code:
func invertArr(arr:[[Int]]) -> [[Int]]{
    var counter = 0
    var result = [[]]
    for element in arr{
        if counter == 0{
            continue
        }
        var counter2 = 0
        for item in element.reversed(){
            result[counter][counter2] = item
            counter2 += 1
        }
        counter += 1
    }
    return result
}
Thank you for helping!
Because of two serious issues, your code cannot work even if you call the method with the arr parameter label and declare result as [[Int]]():
Since counter stays 0 (the continue skips the increment at the end of the loop body), the first loop always continues and an empty array is returned.
result[counter][counter2] = item crashes reliably, because there are no items at the given indices.
To invert the order of the items in the inner arrays this is a generic version
func invertArr<T>(arr: [[T]]) -> [[T]] {
    var result = arr
    for (index, element) in arr.enumerated() {
        result[index] = element.reversed()
    }
    return result
}

let array = [[1, 2, 3, 4], [5, 6, 7, 8]]
let inverted = invertArr(arr: array) // [[4, 3, 2, 1], [8, 7, 6, 5]]
If you also want to reverse the items in the outer array, return result.reversed().
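For illustration, a sketch of that variant (same body as the generic function above; the name invertAll is mine, to keep the two versions distinct):

```swift
// Reverses each inner array, then the outer array as well.
func invertAll<T>(arr: [[T]]) -> [[T]] {
    var result = arr
    for (index, element) in arr.enumerated() {
        result[index] = element.reversed()
    }
    return result.reversed() // also reverse the outer array
}

let nested = [[1, 2, 3, 4], [5, 6, 7, 8]]
print(invertAll(arr: nested)) // [[8, 7, 6, 5], [4, 3, 2, 1]]
```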

Unexpected behavior with simple Swift code

I'm trying to get a var to take an array's value at a given position, then divide it by another var's value taken from another array at its position, and assign the result to a third var.
Now, in a function, I increment the 1st var by 1 and then append the resulting value to a third array.
Or so I thought.
It appends a value that is the simple division of the first two vars, without incrementing the first var.
But in the console, the 3rd var (holding the division's result) shows the increment.
It's only in the append part of the function that it isn't taken into account.
I'm sure it's a syntax problem, but I'd be really thankful if anyone could look at my code.
import UIKit
var positionA = 3
var positionB = 1
var arrayA = [0, 1, 2, 3, 4, 5, 6]
var arrayB = [0, 1, 2, 3, 4, 5, 6]
var arrayPosition = arrayA[positionA] / arrayB[positionB]
var arrayOfValues = [0, 1, 2, 3, 4]
print(arrayOfValues[1])
print(arrayOfValues[arrayPosition])
func increment() {
    positionA = positionA + 1
    arrayOfValues.append(positionA)
}
increment()
print(positionA)
print(arrayOfValues)
print("adds a value of 4 to the arrayOfValues (equal to positionA + 1)")
func incrementB() {
    positionA = positionA + 1
    arrayOfValues.append(arrayPosition)
}
incrementB()
print(positionA)
print(arrayOfValues)
print("adds a value of 3 to the arrayOfValues, which is still equal to positionA, WITHOUT adding a +1 in the calculation of arrayPosition, when it SHOULD ADD a value of 5 (positionA + 1 from increment(), +1 from incrementB())")
arrayPosition is never changed after its initial assignment (with a value of 3, resulting from 3 / 1).
Although positionA will indeed be 5 after the 2nd increment, that's not what you're adding to your arrayOfValues, so the result is as expected (3).
[EDIT]
You could make it a computed variable like this
var arrayPosition : Int { return arrayA[positionA] / arrayB[positionB] }
It's just like a function but you don't need to use () to call it.
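A minimal sketch of that fix, reusing the question's names (playground-style; assumes the same arrays as above):

```swift
var positionA = 3
var positionB = 1
let arrayA = [0, 1, 2, 3, 4, 5, 6]
let arrayB = [0, 1, 2, 3, 4, 5, 6]

// Computed variable: re-evaluated on every read, so it follows positionA.
var arrayPosition: Int { return arrayA[positionA] / arrayB[positionB] }

print(arrayPosition) // 3 / 1 = 3
positionA += 1
print(arrayPosition) // re-computed: 4 / 1 = 4
```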

Reduce array to tuple of first and last element?

I have an array that I would like to first sort, then return the first and last element of the sorted array. I thought I could use reduce, but what if I don't have an initial value?
Here is the array I'm trying to work with:
let myNumbers = [4, 9, 6, 2, 3]
How can I map this to the first and last elements of the sorted array, like this?
(2, 9)
Method 1: min()/max()
This is the easiest way:
let input = [4, 9, 6, 2, 3]
let output = (input.min(), input.max())
print(output) //(Optional(2), Optional(9))
If you're certain that the array isn't empty, you can safely force unwrap the optionals:
let input = [4, 9, 6, 2, 3]
let output = (input.min()!, input.max()!) // (2, 9)
This approach does 2 iterations over the array, so it's O(N). Unless a sorted list is required elsewhere, sorting and then taking the first/last element would be worse, as it's O(N * log_2(N)).
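For comparison, the sorting approach looks like this (a sketch; it assumes a non-empty array, so first!/last! are safe to force-unwrap):

```swift
let numbers = [4, 9, 6, 2, 3]
let sorted = numbers.sorted()              // O(N log N)
let bounds = (sorted.first!, sorted.last!) // smallest and largest element
print(bounds) // (2, 9)
```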
Method 2: reduce()
If you insist on using reduce, you can do it like this:
let input = [4, 9, 6, 2, 3]
let output = input.reduce((min: Int.max, max: Int.min)) {
    (min($0.min, $1), max($0.max, $1))
} // (2, 9)
Each reduce iteration sets the accumulator to the new min (the smaller of the old min and current element), and the new max (the larger of the old max and the current).
The initial values of the accumulator are set such that:
Any element in the array compares as smaller than the accumulator's min
Any element in the array compares as larger than the accumulator's max
You don't need an initialValue for reduce; it's optional (in JavaScript):
var foo = [1, 40, 20, -20, 50];
var reducer = function(prev, curr) {
  // Without an initial value, the first `prev` is foo[0] (a plain number),
  // so normalize it into a [min, max] pair first.
  var p = Array.isArray(prev) ? prev : [prev, prev];
  return [p[0] <= curr ? p[0] : curr, p[1] >= curr ? p[1] : curr];
};
var baz = foo.reduce(reducer); // [-20, 50]
Or maybe like this:
var foo = [1, 40, 20, -20, 50];
var reducer = function(prev, curr) {
  var p = typeof prev === 'number' ? {min: prev, max: prev} : prev;
  return {min: p.min <= curr ? p.min : curr, max: p.max >= curr ? p.max : curr};
};
var baz = foo.reduce(reducer); // {min: -20, max: 50}
Edit: Just noticed this is for Swift and not JavaScript, whoops lol. I must have been surfing the wrong SO category. I think the principle would be the same in Swift, except you probably do need to provide some kind of initial value.
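In Swift, reduce does require an initial value, but you can mimic the seedless JavaScript form by seeding it with the first element (a sketch; it force-unwraps first, so it assumes a non-empty array):

```swift
let values = [1, 40, 20, -20, 50]

// Seed the accumulator with the first element, then fold over the rest.
let seed = (min: values.first!, max: values.first!)
let minMax = values.dropFirst().reduce(seed) { acc, x in
    (min: min(acc.min, x), max: max(acc.max, x))
}
print(minMax) // (min: -20, max: 50)
```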
