I am trying to wrap a C library with Ruby-FFI. However, the function names in the library start with capital letters. As a result, it seems as if FFI is generating constants, and when you try to access them at runtime from Ruby, you get an error saying
NameError: uninitialized constant (name of function)
Is there a way to alias function names so that in Ruby you can access them as methods with lowercase names, as they ought to be?
Wait, it seems that an example is shown here: https://github.com/ffi/ffi/wiki/Windows-Examples
What they do here is the following:
attach_function :message_box, :MessageBoxW, [ :pointer, :buffer_in, :buffer_in, :int ], :int
So it seems that attach_function allows you to pass the alias (the Ruby method name) as the first parameter, followed by the usual parameters. I tried this and it works.
Related
I have an array called @missing_ports that is updated from different subroutines and loops, so the array needs to be global. How do I define this array as global in Perl?
The code contains:
use warnings;
use strict;
I declared the array at the start of the program without the my keyword and got this error:
Global symbol "@missing_ports" requires explicit package name at experiment.pl
With the my keyword I was able to resolve the error, but the array is empty at the end. How can I manage this situation in Perl?
The following creates a lexical variable:
my @missing_ports;
If it's placed at the start of a file, the variable will be visible to the entire file.
The following creates a lexical variable that's aliased to a package (global) variable:
our @missing_ports;
You can use this in multiple places in the same package.
You will still need to use the variable's full name from other packages.
The following declares the package variable:
use vars qw( @missing_ports );
This is not lexically scoped.
You will still need to use the variable's full name from other packages.
And of course, you could always use the full name of the variable.
@main::missing_ports
This requires no declaration, but it will warn if the variable is referenced by its full name only once. So it's better to combine it with our or use vars.
Punctuation variables (e.g. $_) are "super globals". Use of these variables without a package name doesn't default to a variable in the current package; it defaults to a variable in the root/main namespace. (e.g. package Foo; $x; $_; means package Foo; $Foo::x; $main::_;.) There's no means of making additional superglobals.
As a final note, all the approaches listed here other than my are extremely strong indications of bad code.
I have this SystemVerilog testbench, in which I want to use a package written in VHDL.
When I do `include "desired_pkg.vhd", it apparently interprets it as a Verilog file, as ModelSim reports:
Error: (vlog-13069) ** while parsing file included at C:/Users/VHDL/CO_code/CO_18_03/simulation/ed_sim/models/tb_top.sv(22)
** at C:/Users/VHDL/CO_code/CO_18_03/CO_simulation/mentor/020_regmaps_struct_pkg.vhd(1): near "--": syntax error, unexpected --, expecting class.
So it tries to interpret -- (a comment in VHDL) as something in Verilog. How can I include this package without rewriting it in Verilog?
You don't use `include, as that is a preprocessor directive and assumes the included code is Verilog. You need to import the package:
import vhdl_lib::desired_pkg::*;
But be aware: importing VHDL from Verilog is not defined by any standard. Whether it works at all, and which items in the package are supported, is purely down to the tool.
Thanks. It works with Modelsim.
For the sake of completeness, let me add some details below:
I compiled the VHDL package with -mixedsvvh
I also added -mixedsvvh when compiling my SystemVerilog module
I added import my_pkg::*; at the beginning of my SystemVerilog module file
Beware of case: it seems all identifiers are imported in lowercase (people often define VHDL constants in UPPER_CASE, but if you copy-paste such a name as-is into your SystemVerilog, you will get an "Undefined Variable" error).
My two cents ;)
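Putting those steps together, the ModelSim command sequence looks roughly like this (the file, package, and testbench names are illustrative, and mixed-language import is vendor-specific, so check your tool's documentation):

```
vlib work
vcom -mixedsvvh my_pkg.vhd    # compile the VHDL package for mixed SV/VHDL use
vlog -mixedsvvh tb_top.sv     # compile the SystemVerilog testbench the same way
vsim work.tb_top
```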
The h2ph utility generates a .ph "Perl header" file from a C header file, but what is the best way to use this file? Like, should it be require or use?:
require 'myconstants.ph';
# OR
use myconstants; # after mv myconstants.ph myconstants.pm
# OR, something else?
Right now, I am doing the use version shown above, because with that one I never need to type parentheses after the constant. I want to type MY_CONSTANT and not MY_CONSTANT(), and I have use strict and use warnings in effect in the Perl files where I need the constants.
It's a bit strange though to do a use with this file since it doesn't have a module name declared, and it doesn't seem to be particularly intended to be a module.
I have just one file I am running through h2ph, not a hundred or anything.
I've looked at perldoc h2ph, but it doesn't mention the intended import mechanism at all.
Example input and output: For further background, here's an example input file and what h2ph generates from it:
// File myconstants.h
#define MY_CONSTANT 42
...
# File myconstants.ph - generated via h2ph -d . myconstants.h
require '_h2ph_pre.ph';
no warnings qw(redefine misc);
eval 'sub MY_CONSTANT () {42;}' unless defined(&MY_CONSTANT);
1;
Problem example: Here's an example of "the problem," where I need to use parentheses to get the code to compile with use strict:
use strict;
use warnings;
require 'myconstants.ph';
sub main {
print "Hello world " . MY_CONSTANT; # error until parentheses are added
}
main;
which produces the following error:
Bareword "MY_CONSTANT" not allowed while "strict subs" in use at main.pl line 7.
Execution of main.pl aborted due to compilation errors.
Conclusion: So is there a better or more typical way that this is used, as far as following best practices for importing a file like myconstants.ph? How would Larry Wall do it?
You should require your file. As you have discovered, use accepts only a bareword module name, and it is wrong to rename myconstants.ph to have a .pm suffix just so that use works.
The choice of use or require makes no difference to whether parentheses are needed when you use a constant in your code. The resulting .ph file defines constants in the same way as the constant module, and all you need in the huge majority of cases is the bare identifier. One exception to this is when you are using the constant as a hash key, when
my %hash = ( CONSTANT => 99 );
my $val = $hash{CONSTANT};
doesn't work, because the => operator and the hash subscript both auto-quote the bareword, so you are using the string "CONSTANT" as the key. Instead, you must write
my %hash = ( CONSTANT() => 99 );
my $val = $hash{CONSTANT()};
You may also want to wrap your require inside a BEGIN block, like this
BEGIN {
require 'myconstants.ph';
}
to make sure that the values are available to all other parts of your code, including anything in subsequent BEGIN blocks.
The problem does somewhat lie in the require.
Since require is a statement evaluated at run time, it cannot have any effect on the parsing of the rest of the script. So when perl reads the MY_CONSTANT in the print statement, it does not yet know that the subroutine exists, and parses it as a bareword.
It is the same for eval.
One solution, as mentioned by others, is to put the require into a BEGIN block. Alternatively, you may forward-declare it yourself:
require 'some-file';
sub MY_CONSTANT;
print 'some text' . MY_CONSTANT;
Finally, from my perspective, I have not ever used any ph files in my Perl programming.
I saw the following kind of code:
g_print("%s\n", _("foo"));
I hadn't seen this style of passing arguments to a print function before, so I tried these:
g_print("%s\n","foo");
g_print("%s\n",("foo"));
Then I thought it had something to do with GTK (I'm fairly new to it), but then I tried the same thing with printf:
printf("%s\n",_("foo"));
printf("%s\n","foo");
printf("%s\n",("foo"));
and all of the above do the same thing: print foo to stdout. So my question is: does passing the argument as "foo", _("foo"), or ("foo") make any difference at all, or is one just syntactic sugar for the others, both for printf and for g_print?
Sorry if this turns out to be a duplicate question, but I couldn't seem to put my finger on what I should have searched for in the first place.
The _() is actually a C macro defined as:
#define _(x) some_func(x)
Don't confuse it with ("foo") or "foo". Both of these are the same and are just C strings.
You are probably seeing some sort of gettext macro, such as the one described by the glib docs. A common source for these is including glib/gi18n.h directly or (most likely) indirectly.
Marks a string for translation, gets replaced with the translated
string at runtime.
That header file contains a few in this vein:
#define _(String) gettext (String)
I would like to check the syntax of my Perl module (as well as its imports), but I don't want to check dynamically loaded C libraries.
If I do:
perl -c path_to_module
I get:
Can't locate loadable object for module B::Hooks::OP::Check in @INC
because B::Hooks::OP::Check loads some dynamic C libraries, and I don't want to check those...
You can't.
Modules can affect the scripts that use them in many ways, including how they are parsed.
For example, if a module exports
sub f() { }
Then
my $f = f+4;
means
my $f = f() + 4;
But if it were to export
sub f { }
the same code means
my $f = f(+4);
As such, modules must be loaded in order to parse the scripts that load them. To load a module is simply to execute it, be it written in Perl or C.
That said, some folks put together PPI to address the needs of people like you. It's not perfect (it can't be, for the reasons previously stated), but it will give useful results nonetheless.
By the way, the proper way to syntax check a module is
perl -e'use Module;'
Using -c can report errors where none exist, and vice versa.
The syntax checker loads the included libraries because they might change how the code is parsed. If you're certain that this is not happening, you could prevent the loading by manipulating @INC and providing a fake B::Hooks::OP::Check.