I'm a rookie programmer and am having trouble directing my compiler to find certain image files using const char.
I'm reading through a book, "Programming 2D Games" by C. Kelly, and when I look through his
code, he uses this line to find his image file.
const char NEBULA_IMAGE[] = "pictures\NasaNebula.jpg"; // photo source nasaimages.org
When I do this I get an error that I cannot find the file. However if I use this line:
const char NEBULA_IMAGE[] = "E:/TestProject/pictures/NasaNebula.jpg"; //Photo source nasa
It works. Could someone let me know how I can configure my project so that it can find these files without defining their exact path? I've looked around for a while but can't find exactly what I need.
Thank you
Since that path is used at runtime (not at compile time) to locate and load the file, you have to make sure the relative path makes sense in the situation the program runs in. In other words, the current working directory of your application must be one from which the relative path actually leads to the file.
In your specific example the application's working directory must be E:/TestProject, so that the relative path pictures/NasaNebula.jpg resolves to the full (and correct) path E:/TestProject/pictures/NasaNebula.jpg.
In general, relative paths offer a lot of flexibility, for example switching between different resource folders at runtime and thus using different files without changing the application's source code. But relative paths also demand that the application's runtime environment allows them to be resolved.
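If it helps, here is a small diagnostic sketch (assuming a C++17 compiler with std::filesystem; this is not part of the book's code) that prints the directory your relative paths are resolved against:
#include <filesystem>
#include <iostream>

int main()
{
    namespace fs = std::filesystem;

    // The relative path "pictures/NasaNebula.jpg" is resolved against this directory.
    std::cout << "Working directory: " << fs::current_path() << '\n';

    // Check whether the image can actually be found from here.
    std::cout << "Image found: " << std::boolalpha
              << fs::exists("pictures/NasaNebula.jpg") << '\n';
}
If the printed directory is not E:/TestProject, that is why the relative path fails.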
Related
I am trying to find an article that explains the advantages of using a Path object over the File object in Java. I see a comparison of the APIs here http://docs.oracle.com/javase/tutorial/essential/io/legacy.html but don't really see the real advantages spelled out anywhere. Any pointer will be appreciated!
Generally one could say that the two classes have a different focus.
File is designed for file handling (creating, deleting, ...) while Path is focused on filename and path manipulation.
File seems to cover most of the functionality of Path, but there are special cases where Path is the better fit.
Please see the documentation (especially the method overviews) for java.io.File and java.nio.file.Path for further information.
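As a small illustration (the file name here is made up), Path is handy for building and inspecting paths and converts back and forth to File whenever an older API needs it:
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FileVsPath {
    public static void main(String[] args) {
        // Path focuses on building and inspecting paths.
        Path image = Paths.get("pictures", "nebula.jpg"); // hypothetical file
        System.out.println(image.getFileName());
        System.out.println(image.toAbsolutePath());

        // The two APIs convert into each other when needed.
        File legacy = image.toFile(); // for APIs that still expect File
        Path back = legacy.toPath();  // back to the NIO world
        System.out.println(back.equals(image));
    }
}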
Do you think it's a bad habit - or a problem - to include a custom, user-specific machine path in the Poedit translation search paths?
"X-Poedit-SearchPath-0: /myProjectName/Backend/module/Core\n"
"X-Poedit-SearchPath-1: /Users/someUser/Documents/Projects/clipp/Backend/module/Core\n"
The first path is the global one, but as developers join the project, entries like the second one start appearing. These entries are versioned too, and they end up filling the file with many machine-specific alternatives.
Yes, it’s a bad thing to do. I consider it a design failure that Poedit even allows this and 1.8 is going to fix this part of its UI.
You are seeing for yourself why it's bad: it makes the PO file non-portable to other machines. You should use a relative path (from the location of the PO file to …/module/Core) instead.
See https://github.com/vslavik/poedit/wiki/PO-Extensions for description of the X-Poedit-* headers.
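For example, assuming the PO file lives two levels below the project root (the layout here is made up), a portable version of those headers could look like this:
"X-Poedit-Basepath: ../..\n"
"X-Poedit-SearchPath-0: module/Core\n"
Every developer then gets a working search path, regardless of where the project is checked out.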
I had some Lua code with the following line:
JSON = loadfile("JSON.lua")()
The file JSON.lua is in the same directory as the Lua code that line came from. This code worked for me for a while, and then, without my changing the Lua source, JSON.lua, the permissions of any of the files, or the directory from which I was running the Lua code, I started getting a nil error on that line. (I simply recall NO relevant changes that could have any impact on the Lua code.)
Adding an assert revealed that the error was caused by the file not being found. Playing with file permissions and restarting my machine didn't resolve the issue, and pulling back code that I had checked in while it was working perfectly did not resolve the error either.
I resolved the error by changing the line above to provide the absolute path to that JSON.lua file.
Is there anything explaining why the code without the absolute path could have worked for a while and then stopped working?
Note: This behavior of working and then not working happened to me twice over a week. I am puzzled and though I have now found a fix, I am really curious as to the explanation for that intermittent behavior.
Lua uses package.path, whose default value comes from the environment variable LUA_PATH if it is set, as the list of places to search. You can put . at the front of this list to load files from the current directory, or you can put your files in a directory that is already on the list.
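For reference, package.path is a ';'-separated list of templates in which '?' is replaced by the name passed to require; a minimal sketch of putting the current directory first:
-- Show the current search templates.
print(package.path)

-- Prepend the current directory so require("JSON") also finds ./JSON.lua.
package.path = "./?.lua;" .. package.path
local JSON = require("JSON")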
A late answer on this, as I found exactly the same problem.
First, contrary to the previous answer, loadfile doesn't use the package.path search path. It only looks in the specified directory, and if you don't specify a directory, it only looks in the current directory. I can't explain exactly why it stopped working for you, but your Lua code is probably somehow being run with a different current directory than before.
There are two possible fixes: One is to specify an absolute path to loadfile.
JSON = loadfile("c:\\my_folder\\JSON.lua")()
The alternative fix depends on the particular library you're using, which I suspect is Jeffrey Friedl's Lua JSON library. Because it supports the newer Lua module mechanism, you can just load the module with require, which does use the package.path search path.
JSON = require("JSON")
What's the simplest way to find the path to the file in which I am "executing" some code? By this, I mean that if I have a file foo.py that contains:
print(here())
I would like to see /some/path/foo.py (I realise that in practice what file is "being executed" is complicated, but I think the above is well defined - a source file that contains some function that, when executed, gives the path to said file).
I have needed this in the past to make tests (that require some external file) self-contained, and I am currently wondering if it would be a useful way to locate some support files needed by a program. But I have never found a good way of doing this. The inspect module sounds like it should work, but you seem to need a class or function that is defined in that module.
In particular, module instances have a __file__ attribute, but I can't see how to get the "current" module. Objects have a __module__ attribute, but that's the module's name, not the module instance.
I guess one way is to throw and catch an exception and inspect the contents, but that seems like hard work. Surely there is a simple, easy way that I have missed?
To get the absolute path of the current file:
import os
os.path.abspath(__file__)
To get the content of an external file distributed with your package, you could use pkgutil.get_data() (stdlib) or pkg_resources.resource_string() (setuptools); both support execution from zip archives or from standalone executables created by py2exe, PyInstaller, or similar.
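A minimal sketch of both approaches (the package and resource names below are made up):
import os
import pkgutil

# Absolute path of the current source file.
here = os.path.abspath(__file__)
print(here)

# Read a data file shipped inside a (hypothetical) package 'mypkg';
# this also works when the package is imported from a zip archive.
data = pkgutil.get_data("mypkg", "resources/config.json")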
I have a C program built using Autotools. In src/Makefile.am, I define a macro with the path to installed data files:
AM_CPPFLAGS = -DAM_INSTALLDIR='"$(pkgdatadir)"'
The problem is that I need to run make install before I can test the binary (since it needs to be able to find the data files).
I can define another macro with the path of the source tree so the data files can be located without installing:
AM_CPPFLAGS = -DAM_INSTALLDIR='"$(pkgdatadir)"' -DAM_TOPDIR='"$(abs_top_srcdir)"'
Now, I would like the following behavior:
If the binary was installed via make install, use AM_INSTALLDIR to fetch data files.
If the binary was not installed, use AM_TOPDIR to fetch data files.
Is this possible? Is there a better approach to this problem?
What I do (in https://rhdunn.github.com/cainteoir/) is:
const char *basedir = getenv("CAINTEOIR_DATADIR");
if (!basedir)
basedir = DATADIR "/" PACKAGE; // e.g. /usr/share/cainteoir-engine
and then run it (in tests/harness.py) as:
CAINTEOIR_DATADIR=`pwd`/data src/apps/metadata/metadata test_file.epub
This then allows the user to change where the data is loaded from if they wish.
Making the program able to use a run-time configuration, as proposed by reece, is a good solution. If for some reason you do not want it to be configurable at run time, a common solution is to build a test binary differently from the installed binary (there are other problems associated with this, in particular ensuring that the program you are testing behaves consistently with the program that is installed). An easy way to do that is something like:
bin_PROGRAMS = foo
check_PROGRAMS = test-foo
test_foo_SOURCES = $(foo_SOURCES)
AM_CPPFLAGS = -DINSTALLDIR='"$(pkgdatadir)"'
test_foo_CPPFLAGS = -DINSTALLDIR='"$(abs_top_srcdir)"'
Rather than using a binary with a different name, you might want to have a dedicated tests directory and build the program using the same name as the original.
Note that I've changed the name from AM_INSTALLDIR to INSTALLDIR. Automake reserves names
beginning with "AM_" for its own use, and by using that name you are stomping on Automake's
namespace.
A bit of additional information first: The data files are under active development, and I have various scripts that need to call binaries using local data files, whereas installed binaries should use stable, installed data files.
My original solution made use of an environment variable, as proposed by reece. But I didn't want to manage setting up environment variables in various places, and I didn't want any risk of the wrong data files being picked up due to a mistake.
So the solution I ended up with was to define macros for both locations at build time, and add a flag (-local) to the binaries to force local data files to be used.
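For what it's worth, a minimal sketch of that approach (the macro names, the data/ subdirectory and the flag handling here are illustrative, not the exact code):
#include <stdio.h>
#include <string.h>

/* Both locations are baked in at build time via AM_CPPFLAGS, e.g.
 *   -DINSTALLDIR='"$(pkgdatadir)"' -DTOPDIR='"$(abs_top_srcdir)"' */
#ifndef INSTALLDIR
#define INSTALLDIR "/usr/local/share/foo"
#endif
#ifndef TOPDIR
#define TOPDIR "."
#endif

int main(int argc, char **argv)
{
    const char *datadir = INSTALLDIR;

    /* A "-local" flag forces the in-tree data files during development. */
    for (int i = 1; i < argc; i++) {
        if (strcmp(argv[i], "-local") == 0)
            datadir = TOPDIR "/data";
    }

    printf("loading data from %s\n", datadir);
    return 0;
}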