I have an issue with the Excel compiler of the Pycel package. When the Excel compiler reads an Excel file and is assigned to a variable, let's say variable X, the variable behaves as if it were global (which it shouldn't). That is, when X is passed into a class method as an input, any change made to it inside the method is reflected in X, even though it is never returned as an output.
What I need is for the Excel compiler object to behave like any ordinary variable in Python: local unless it is explicitly declared global. That is, any change to the compiler object inside a called class should not be visible outside of the class.
Does anyone know how I can get a local behavior from the Excel compiler?
from pycel import ExcelCompiler

class myClass:
    @staticmethod
    def myMethod(Z):
        Z.evaluate(CellToCalculate)
        Z.set_value(CellToCalculate, ValueToInsert)

X = ExcelCompiler(filename)
Value1 = X.evaluate(CellToCalculate)
myClass.myMethod(X)
Value2 = X.evaluate(CellToCalculate)
I expect: Value1 = Value2
What I see: Value1 != Value2
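For reference, a minimal sketch of the kind of behaviour I am after, assuming pycel's ExcelCompiler object can be deep-copied with Python's copy module (not verified against pycel):

import copy

def call_with_local_copy(compiler, cell, value):
    # Hypothetical helper: work on a deep copy so the caller's
    # compiler object stays untouched (assumes deepcopy works on it).
    local_compiler = copy.deepcopy(compiler)
    local_compiler.set_value(cell, value)
    return local_compiler.evaluate(cell)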
In a React project, can you think of any possible way of assigning a type to a variable based on .env (or any other file) at compile time?
I have a variable declared as let X: A | B | C | null = null. I assign A, B, or C to X based on a value stored in the .env file. The reason I want to do this is so that hovering over X or clicking on it shows me / forwards me to the correct class.
I've come up with no solid solution yet
Environment variables are read at run time.
TypeScript types no longer exist at run time.
So it's not possible.
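For illustration, a small sketch (the class names A, B, C and the MY_CLASS variable are assumptions) of why the editor cannot pick the type from .env:

// Hypothetical classes standing in for the real ones.
class A {}
class B {}
class C {}

let X: A | B | C | null = null;

// process.env.MY_CLASS is only known at run time; by then the type
// annotation on X has already been erased, so hover / "go to definition"
// cannot be narrowed to A, B, or C at compile time.
switch (process.env.MY_CLASS) {
  case "A": X = new A(); break;
  case "B": X = new B(); break;
  case "C": X = new C(); break;
}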
I am using Arm Compiler 6 and I have a use case where I need to place a number of variables in one particular section.
For a single variable, it works fine.
Like:
int var __attribute__((section("mysec")));
In my scatter (linker) file, the variable var appears correctly at the address of the "mysec" section.
Now I have a use case where I need to store var, var1 and var2 in the same section, "mysec".
I also want to avoid repeating __attribute__((section("mysec"))) for each variable.
I found something related to pragmas that looked like it could help here, but nothing worked out.
Can you let me know if there is any way to do this?
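For reference, a sketch of one pragma-based possibility, assuming Arm Compiler 6's clang front end supports the #pragma clang section directive (worth checking in the toolchain documentation for your version):

/* Everything defined between these pragmas goes into "mysec":
   initialised variables via the data mapping, zero-initialised ones via bss. */
#pragma clang section data="mysec" bss="mysec"

int var  = 1;
int var1 = 2;
int var2;        /* zero-initialised, placed through the bss mapping */

/* Restore the default sections for the rest of the file. */
#pragma clang section data="" bss=""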
I have a class with a port of dimensions [x, y] which is connected to another class that has a matching port. Now I want to provide the values of x and y through an external function call in which I read a .xml file and obtain the values for x and y. But Dymola gives an error for this, since during compilation the port comes out as a non-fixed-size array.
Screenshot of error is attached.
The array sizes are structural parameters and they usually cannot depend on external function calls, because they have to be known at compile time. This is, however, supported in for example OpenModelica, where a DLL of the external function is built and called and the results are fetched during model compilation.
The only way to support this in all tools is to generate the model with an external tool that reads the XML and rewrites the .mo file with the values it read.
You could probably have something like Parameters.mo:
package Parameters
  constant Integer nTube = <EXTERN_NTUBE>;
  constant Integer nSeg = <EXTERN_NSEG>;
end Parameters;
and your external tool will read the XML and fill in <EXTERN_NTUBE> and <EXTERN_NSEG> in Parameters.mo, which you can then use in your models via Parameters.nTube and Parameters.nSeg. Maybe it would be good to give some defaults so that the file also works when used directly:
package Parameters
  constant Integer nTube = 1;
  constant Integer nSeg = 2;
end Parameters;
and then your external tool will replace 1 and 2 with the needed values before compilation.
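For illustration, a hypothetical model that uses these constants as structural parameters (the model and array names are made up):

model TubeBundle
  // Array sizes are fixed at translation time via the generated constants.
  parameter Integer nTube = Parameters.nTube;
  parameter Integer nSeg = Parameters.nSeg;
  Real T[nTube, nSeg];
end TubeBundle;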
This should be improved in Dymola 2017 (without the need for modifying the Modelica code). In earlier versions of Dymola it should work if you translate the C functions called to compute nTube and nSeg.
If that does not help, your complete code would be needed to analyze the problem.
This is a general VBA array question; it is not specific to the MS Office apps (no tables involved).
I'm looking to find out how to create multiple one-dimensional arrays at runtime (maybe even public ones), using data from a .csv file.
I can explain. This is an example of how the csv file would look:
------- CSV FILE ----------------------------
Colors,white,red,blue,green (... and so on)
Animals,cat,dog,wolf,bear (...and so on)
Food,cake,bread,garlic (...and so on)
...and so on, more rows
The opening part is solved, as is the part where each row is assigned to a temporary variable, and more: each row is split into values that are assigned to a temporary array.
So, I have:
tempArray1, containing ("Colors", "white", "red" ...etc)
tempArray2, containing ("Animals", "cat", "dog" ...etc)
...
The goal is to create (or to address an existing) array NAMED after the first value of each row, and then assign the rest of the values from that row to the array.
Please do not ask me why I am not using a multi-dimensional array.
I have my reasons.
A similar question related to this case is:
if I already have a one-dimensional public array, defined, named and populated - let's say it is Colors() - how can I address it using the value "Colors"?
Not only address it, but also erase it, ReDim it, or change values in it?
When I say "Colors" I mean a string value, not Colors() hard-coded into the sub or function.
With respect to your "a similar question related to this case", you can do the following:
Create a public class module containing your array Colors()
Then, add a "Microsoft Script Control" ActiveX control (possibly to your form), and keep it hidden
Add code (as a string) dynamically to your ScriptControl, and execute it. Now, if this code contains (as a string), say, Colors(1) = "red", then it will actually modify the Colors array in your class module.
Note: However, there's a catch. Since it is a class module, and not a normal module, it will only modify the object created inside the script-control. So, you might have to do all the rest of the coding too in that script-control (by dynamically adding code to it and executing it), otherwise all changes would be lost, as the scope of that object would be limited to the code contained inside the script-control.
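A rough sketch of the idea, in a simplified variant where the array lives inside the script engine itself (assumes a reference to Microsoft Script Control 1.0; names and values are illustrative):

Sub DemoNamedArray()
    Dim sc As New MSScriptControl.ScriptControl
    Dim arrName As String

    sc.Language = "VBScript"
    arrName = "Colors"                        ' would come from the first CSV value
    sc.AddCode "Dim " & arrName & "(10)"      ' create the array inside the script engine
    sc.ExecuteStatement arrName & "(1) = ""red"""
    MsgBox sc.Eval(arrName & "(1)")           ' shows "red"
End Sub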
I have two things that I would like to do with my MATLAB function. I have a function that will get data from a series of files. Say I have this file structure:
Number 1:
I would like to be able to have some selected variables in a function become globally accessible and persist after the function has completed. That is, say I have the following function:
function [] = myfunction(directory)
    b = read(directory)
    c = (do some stuff with b)   % c is a struct
    % somehow globalise c
end
I understand that you can have c = myfunction(directory) and then pass the variable back, however I would not like to do this; I would rather that the variable c just persists in the global scope. Can this be done?
Number 2:
Is it possible for a MATLAB function to have default arguments? Let's take the function from above. What I want to do is to have the directory default to the current directory. For example, if the following directory structure exists:
MatlabMainFolder
-> MatlabFunctions
-> Data
The user should be able to run the function from within Data and it will use the Data directory, but should also be able to call myfunction('../Data') from the MatlabFunctions directory for the same effect. Is this possible?
First:
I think the following should make a variable globally accessible from within the function:
function [] = myfunction(directory)
    b = read(directory)
    global c                         % <------ declare c as global inside the function
    tmp = (do some stuff with b)     % tmp is a struct
    c = tmp;                         % <------ assign the result to the global
end
Then, add global c in all other files where you want c to be available, before using it. I would recommend just using c = myfunction(...) if at all possible, though.
As suggested by @Ben, assignin can also be used to assign a variable into a different workspace. That way, from a function, any variable can be assigned in the 'base' workspace, which is the workspace accessible by all other files too. In this case:
assignin('base','c',c)
will create a variable c in the base workspace, with the same value as c in the function file. (Add this line at the end of the function).
Second:
You could do something like this:
function myfunction(path)
    if nargin < 1
        path = '../Data';
    end
    % do things with path
end
This way, if the function is called without inputs, the default path ../Data will be used; otherwise, the path given as input will be used. nargin equals the number of arguments that were passed in.
To 1): globals, as well as assignin, are often frowned upon. You could also use persistent.
Pseudocode snippet:
function out = fun(directory)
    persistent stored
    if isempty(stored)
        stored = (fill stored here)   % pseudocode: computed only on the first call
    end
    out = stored;
end
Or even handle objects in case you feel like going modern ;)
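A minimal sketch of the handle-object alternative (the class and property names are made up):

classdef DataHolder < handle
    properties
        c   % struct filled in by the function
    end
end

% Because DataHolder is a handle class, changes made to it inside a
% function are visible to the caller without globals or assignin:
%   h = DataHolder();
%   myfunction(directory, h);   % inside the function: h.c = (stuff with b)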
To 2): read about inputParser (doc inputParser) - it's more work than in many other languages - but it's possible.
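For illustration, a minimal sketch of a default directory argument via inputParser, defaulting to the current directory with pwd (the argument name is an assumption):

function myfunction(varargin)
    p = inputParser;
    addOptional(p, 'directory', pwd);   % default: current directory
    parse(p, varargin{:});
    directory = p.Results.directory;
    % do things with directory
end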