In Meteor, are there any folders where I can put a .zip which will not be sent to the client?
Secondary question: how can I make temporary download links on the app, which self-destruct after a period of time?
The idea is that only the server will have access to this file. /server doesn't seem to work because any files I place in there that are not code are not included in the final bundle.
My Solution - Heroku Filesystem
This is probably not the best solution to this problem - however, to anyone else that needs to have files bundled with the app which cannot be seen by the client, here's how I did it.
Note that deleting the secure files is done because Heroku does not persist filesystem changes on restart.
Place files in a folder named "securefiles" or similar in your /public folder.
These get compiled to a folder named /static in the bundle. Note that if you're using the Heroku buildpack, the actual path to the working directory for the server is /app/.meteor/heroku_build/app/.
Next, on server start, detect whether the app is bundled. You can do this by checking for the existence of the static folder; there are probably other files unique to a bundle as well.
If you're bundled, copy the files out of public using ncp. I've made a Meteorite package just for this purpose; run mrt add ncp to add the node copy tool to your project. I recommend copying to the root directory of the app, as this is not visible to clients.
Next, delete the folder from static.
At this point you have files which can only be accessed by the server. Here's some sample CoffeeScript to do this:
Meteor.startup ->
  fs = __meteor_bootstrap__.require 'fs'
  bundled = fs.existsSync '/app' # Checking /app because on Heroku the app is stored in /app
  rootDir = if bundled then "/app/.meteor/heroku_build/app/" else "" # Not sure how to get the path to the root directory on a local build; this is a bug
  # Clear out any secure files left over from a previous run
  # (do the same with any other temporary folders you want to get rid of on startup)
  if fs.existsSync rootDir + "securefiles"
    rmDir rootDir + "securefiles" # rmDir is a recursive directory-removal helper (not shown here)
  # Now copy the secure files out of the bundle's static folder,
  # then remove them from static so they are never served to clients
  ncp rootDir + 'static/securefiles', rootDir + 'securefiles', () ->
    rmDir rootDir + 'static/securefiles' if bundled
Secure/Temporary File Downloads
Note that this code depends on the random package and on my ncp package.
It's very easy to add on to this system to support temporary file downloads, as I have done in my project. Here's how, run url = setupDownload("somefile.rar", 30) to create a temporary file download link.
setupDownload = (dlname, timeout) ->
  # fs and rootDir are the same ones set up in the startup code above;
  # Fiber is assumed to be available on the server (e.g. via the fibers module)
  if !timeout?
    timeout = 30
  file = rootDir + 'securefiles/' + dlname
  return '' if !fs.existsSync file
  dlFolder = rootDir + 'static/dls'
  fs.mkdirSync dlFolder if !fs.existsSync dlFolder
  dlName = Random.id() + '.rar' # Possible improvement: detect the file extension
  dlPath = dlFolder + '/' + dlName
  # Copy the secure file into the publicly served dls folder,
  # then schedule its deletion after the timeout
  ncp file, dlPath, () ->
    Fiber(() ->
      Meteor.setTimeout(() ->
        fs.unlink dlPath
      , 1000 * timeout)
    ).run()
  "/dls/" + dlName
Perhaps I will make a package for this. Let me know if you could use something like that.
Related
How to access the TFS build agent folder path using a batch file?
I am calling the RunScript tool from the build workflow (which calls a Windows batch file).
I tried the environment variable BUILD_REPOSITORY_LOCALPATH ($(BUILD_REPOSITORY_LOCALPATH), $env:BUILD_REPOSITORY_LOCALPATH), but it didn't give any result.
Need some assistance on this.
I used another workaround for this instead of getting it via the workspace. From the sources directory I drill down to find my solution file; the path containing my solution is the local directory path I need.
From the batch file I call my exe and pass %TF_BUILD_SOURCESDIRECTORY% as a parameter.
// Requires: using System; using System.IO;
string[] subdirectoryEntries = Directory.GetDirectories(targetDirectory);
// targetDirectory is the input parameter passed in from the batch file
for (int i = 0; i < subdirectoryEntries.Length; i++)
{
    // My root folder always contains a specific folder with the name MyFolder
    // and a file Myfile.sln
    if (subdirectoryEntries[i].ToLower().Contains("myfolder"))
    {
        Console.WriteLine("My source code path is " + targetDirectory);
    }
    // Similarly I check for Myfile.sln and then get my path.
}
This may be a very crude way, but it worked for me.
The variable you are looking for is TF_BUILD_SOURCESDIRECTORY. Please refer to the XAML build documentation.
TF_BUILD_SOURCESDIRECTORY: The sources sub-directory of the build agent working directory. This directory contains your source code. For example: C:\Build\BuildBot3\CoolApp\CIBuild\src.
If you want to get C:\TFS_Build\src\V9, it's just a local path: the path that you have mapped the server path to on your machine. There is no built-in TF_BUILD environment variable that achieves this directly.
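As a minimal illustration of what the OP describes, a RunScript batch step could hand the sources directory to a helper tool like this (FindSolutionDir.exe is a made-up name standing in for your own exe):
@echo off
rem TF_BUILD_SOURCESDIRECTORY is populated by the XAML build agent for this build
echo Sources directory is %TF_BUILD_SOURCESDIRECTORY%
rem Pass it to a helper tool that drills down to the solution folder
FindSolutionDir.exe "%TF_BUILD_SOURCESDIRECTORY%"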
You could use the TFS API to get the related info: first get the workspace information for the build server's workspace, then query its path mappings. Sample code for your reference:
// Get the workspace information for the build server's workspace
var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(sourcesDirectory);
// Get the TFS Team Project Collection information from the workspace cache
// information then load the TFS workspace itself.
var server = new TfsTeamProjectCollection(workspaceInfo.ServerUri);
var workspace = workspaceInfo.GetWorkspace(server);
Once you have a workspace, you can query it for the path mappings. It
will do the necessary translation from server to local path based on
your workspace mappings. For example:
workspace.GetServerItemForLocalItem(@"C:\TFS_Build\src\V9");
and
workspace.GetLocalItemForServerItem("$/DEV/V9");
This mechanism will only work, however, if your build definition
actually sets up the workspace to include these files.
For more details, please refer to this similar question: How do I resolve the root and relative paths of TFS folders on the server?
Update from the OP:
From the sources directory I drill down to find my solution file; the path containing my solution is the local directory path I need.
We have a web app. This web app is installed for each client of ours in a different folder in our VPS. We also have a separate folder with the base files of the web app (all code up to date).
The problem we're having is that we need to automate the update process of the web app for all client installations. If we add files to the base web app, move files, create a directory, or remove a file or directory, these changes should automatically be applied to every client installation of the web app. Currently we're in beta, and each code update results in a manual update of all files for each client installation using FTP; the more changes there are, the more time this process takes and the more complex it becomes.
Is there a tool available to automate this kind of process? Or if not, how do you suggest it should be approached?
/
/clients
/client1.domain.com
/[web app subfolders and files...]
/client2.domain.com
/[web app subfolders and files...]
/client3.domain.com
/[web app subfolders and files...]
/base_web_app
/[web app subfolders and files...]
So basically, each time we do any changes to the contents of /base_web_app, those changes should be automatically applied (sync) to the web app installations inside /clients (that is, /client1.domain.com, /client2.domain.com, /client3.domain.com).
It is also important to note that we need some files and/or subfolders to be ignored/not overwritten. Mainly configuration files specific to each client's installation.
Check out rsync: http://rsync.samba.org/examples.html. It is a tool for synchronizing files from one area to another (say, your staging area to your production area). You can use patterns to specify what to sync and what to exclude, and it only copies changed files.
On your staging area (where you have the latest changes you want to sync), you could do something like this:
# sync staging area base_web_app directory to production base_web_app
# this syncs the entire local base_webapp directory to remote /base_webapp
rsync -avRc base_webapp server:/
# sync staging area base_web_app files to clients/client* directories, excluding the config directory
# this syncs the entire base_webapp to each remote client dir, excluding the config dir
rsync -avRc --exclude 'config/*' base_webapp server:/clients/client1.domain.com
rsync -avRc --exclude 'config/*' base_webapp server:/clients/client2.domain.com
rsync -avRc --exclude 'config/*' base_webapp server:/clients/client3.domain.com
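If more clients are added later, a small wrapper script keeps this from getting repetitive. This is only a sketch based on the layout above; the script name and the rsync-exclude.txt pattern file are made up for the example:
#!/bin/sh
# deploy_base.sh - sync the base web app into every client installation.
# Patterns listed in rsync-exclude.txt (e.g. "config/*") are never overwritten.
for client in client1.domain.com client2.domain.com client3.domain.com; do
    rsync -avRc --exclude-from='rsync-exclude.txt' base_webapp "server:/clients/$client"
done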
I have this error message:
Preparing to deploy: Created staging directory at:
'C:\Users\leet\AppData\Local\Temp\appcfg4768292050846213939.tmp'
Scanning for jsp files. Compiling jsp files. Scanning files on
local disk. java.io.IOException: Jar
C:\Users\leet\AppData\Local\Temp\appcfg4768292050846213939.tmp\WEB-INF\lib\appengine-api-1.0-sdk-1.7.7.jar
is too large. Consider using --enable_jar_splitting.
I issued the command like this, but it does not work with --enable_jar_splitting.
"C:\Program Files\Java\jdk1.7.0_17\bin\java.exe" -Xmx1100m -cp
"%~dp0..\lib\appengine-tools-api.jar"
com.google.appengine.tools.admin.AppCfg --enable_jar_splitting -e
user#domain.com update "C:\myfolder\myproject\war"
Any comment?
The Java App Engine 1.7.7.1 SDK has been released to address this Windows-specific issue.
The Google Eclipse plugin has been updated, as well as the Google App Engine Maven artifacts and plugin (just use the 1.7.7.1 version).
To solve the library error message, you have to do this:
1) Open Windows Explorer and go to your Eclipse folder, e.g. ".\eclipse\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user".
2) You will see a file called "appengine-api-1.0-sdk-1.7.7.jar"; rename it to "appengine-api-1.0-sdk-1.7.7.original". (Don't delete it, you may need it in the future.)
3) Copy the two files created in the splitting steps below, "appengine-api-1.0-sdk-1.7.7-1.jar" and "appengine-api-1.0-sdk-1.7.7-2.jar", and paste them into this folder.
4) Switch to the Eclipse IDE, clean the project and rebuild it. The error message will then go away.
I solved the issue by splitting the "appengine-api-1.0-sdk-1.7.7.jar" file myself.
In case anyone else wants to know how to do that, follow these steps:
1) Unzip the "appengine-api-1.0-sdk-1.7.7.jar" file with 7-Zip.
2) Distribute the extracted contents evenly between two folders (about 15 MB each), regardless of structure.
3) Name the first folder "appengine-api-1.0-sdk-1.7.7-1" and the second folder "appengine-api-1.0-sdk-1.7.7-2".
4) Make sure you have the JDK installed, e.g. "C:\Program Files\Java\jdk1.7.0_17\bin", and add that bin folder to your PATH so you can run jar from anywhere.
5) IMPORTANT: you must run the next command from inside the first "appengine-api-1.0-sdk-1.7.7-1" folder, not from the parent folder of those folders.
6) Launch cmd.exe and type "jar cf appengine-api-1.0-sdk-1.7.7-1.jar *" for the first archive.
7) Do the same for the second archive (repeat steps 5 and 6).
8) Go to the \war\WEB-INF\lib folder and delete the existing appengine-api-1.0-sdk-1.7.7.jar.
9) Copy appengine-api-1.0-sdk-1.7.7-1.jar and appengine-api-1.0-sdk-1.7.7-2.jar into the \war\WEB-INF\lib folder.
10) Now deploy it. It should work like a charm! (The commands from steps 5 to 9 are summarised below.)
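Put together, steps 5 to 9 boil down to something like the following sketch (C:\split is just an example location for the two folders; the war path is the one from the question):
rem Repackage each half from inside its own folder, not from the parent
cd /d C:\split\appengine-api-1.0-sdk-1.7.7-1
jar cf appengine-api-1.0-sdk-1.7.7-1.jar *
cd /d C:\split\appengine-api-1.0-sdk-1.7.7-2
jar cf appengine-api-1.0-sdk-1.7.7-2.jar *
rem Swap the original jar for the two halves in the project's lib folder
del C:\myfolder\myproject\war\WEB-INF\lib\appengine-api-1.0-sdk-1.7.7.jar
copy C:\split\appengine-api-1.0-sdk-1.7.7-1\appengine-api-1.0-sdk-1.7.7-1.jar C:\myfolder\myproject\war\WEB-INF\lib\
copy C:\split\appengine-api-1.0-sdk-1.7.7-2\appengine-api-1.0-sdk-1.7.7-2.jar C:\myfolder\myproject\war\WEB-INF\lib\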
Using these instructions:
To clarify, we're going to release a minor update for 1.7.7. For the
meantime, you can re-jar the file as follows:
cd to the working directory
$ jar xf somewhere\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar
$ jar cfm somewhere\appengine-api-1.0-sdk-1.7.7.jar META-INF/MANIFEST.MF *
and replace the old jar with the newly created one.
from http://www.mail-archive.com/google-appengine#googlegroups.com/msg67954.html
and the messages from the other solutions here, I was able to make it work like this:
Open a command line and go into the bin directory of your Java installation, where the jar.exe file is:
cd "C:\Program Files\Java\jdk1.7.0_17\bin\"
Then, you need to find the file "appengine-api-1.0-sdk-1.7.7.jar" somewhere on your computer. It's in two places (not counting the temp directories): in the \war\WEB-INF\lib folder of your Eclipse project, and in the "plugins" folder of your Eclipse installation, precisely: \plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\
You only need one of those two paths.
Now in the command line, just type :
jar xf "C:\whatever-folder-your-eclipse-is-in\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar"
and then
jar cfm "C:\whatever-folder-your-eclipse-is-in\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar" META-INF/MANIFEST.MF
Now, if you go to that folder and check the .jar file, it should be 11 MB instead of 30-something. Now you need to copy this one and replace the same jar in your webapp folder in \war\WEB-INF\lib\, so that both jars named "appengine-api-1.0-sdk-1.7.7.jar" are 11 MB in size.
Now the error should be gone and you don't have to split anything.
I'm having trouble finding documentation regarding this. After some googling I found that bin, conf, logs, temp, webapps and work are directories that should exist in CATALINA_BASE.
temp, logs, webapps, bin and work I don't have any trouble understanding.
bin, I suppose, is just another bin folder; if for some reason both CATALINA_HOME/bin and CATALINA_BASE/bin are on the PATH, then the scripts in both folders will be available for execution.
But what about conf? Will the contents of CATALINA_HOME/conf be ignored entirely if CATALINA_BASE is set? Suppose I only need to customize a few config files per CATALINA_BASE instance: would I still need to keep a complete set of config files in CATALINA_BASE/conf, or could the standard config files in CATALINA_HOME/conf be shared?
And ditto for CATALINA_BASE/lib: would this work as a "global" lib folder per instance?
You can find the answer in the Tomcat documentation:
http://tomcat.apache.org/tomcat-6.0-doc/RUNNING.txt
Advanced Configuration - Multiple Tomcat Instances
In many circumstances, it is desirable to have a single copy of a
Tomcat binary distribution shared among multiple users on the same
server. To make this possible, you can set the $CATALINA_BASE
environment variable to the directory that contains the files for your
'personal' Tomcat instance.
When you use $CATALINA_BASE, Tomcat will calculate all relative
references for files in the following directories based on the value
of $CATALINA_BASE instead of $CATALINA_HOME:
bin - Only setenv.sh (*nix), setenv.bat (windows) and tomcat-juli.jar
conf - Server configuration files (including server.xml)
logs - Log and output files
webapps - Automatically loaded web applications
work - Temporary working directories for web applications
temp - Directory used by the JVM for temporary files (java.io.tmpdir)
Note that by default Tomcat will first try to load classes and JARs
from $CATALINA_BASE/lib and then $CATALINA_HOME/lib. You can place
instance specific JARs and classes (e.g. JDBC drivers) in
$CATALINA_BASE/lib whilst keeping the standard Tomcat JARs in
$CATALINA_HOME/lib.
If you do not set $CATALINA_BASE, $CATALINA_BASE will default to the
same value as $CATALINA_HOME, which means that the same directory is
used for all relative path resolutions.
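Putting that together: conf is resolved against $CATALINA_BASE only, which implies each instance needs its own copies of the configuration files, while lib is layered ($CATALINA_BASE/lib is searched first, then $CATALINA_HOME/lib). A rough sketch of setting up a second instance, using made-up /opt paths:
# One shared binary distribution, one directory per instance (paths are examples)
export CATALINA_HOME=/opt/tomcat
export CATALINA_BASE=/opt/tomcat-instance1
# Each instance needs its own conf, lib, logs, temp, webapps and work directories
mkdir -p "$CATALINA_BASE/conf" "$CATALINA_BASE/lib" "$CATALINA_BASE/logs" \
         "$CATALINA_BASE/temp" "$CATALINA_BASE/webapps" "$CATALINA_BASE/work"
# conf is resolved against CATALINA_BASE, so start from a copy of the stock
# files and customise ports etc. per instance
cp "$CATALINA_HOME"/conf/* "$CATALINA_BASE/conf/"
# Instance-specific JARs (e.g. a JDBC driver) go in $CATALINA_BASE/lib;
# the shared Tomcat JARs stay in $CATALINA_HOME/lib
# Start the instance with the shared scripts
"$CATALINA_HOME/bin/startup.sh"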
While working on my GAE project under my dev environment, whenever I upload data to my dev datastore, the logfiles are stored in my current directory, for instance:
C:\dev> ls
bulkloader-log-20090912.104643
bulkloader-log-20090912.104648
bulkloader-log-20090912.104731
bulkloader-log-20090912.105526
bulkloader-log-20090912.110428
bulkloader-progress-20090912.104648.sql3
bulkloader-progress-20090912.104731.sql3
bulkloader-progress-20090912.105526.sql3
bulkloader-progress-20090912.110428.sql3
project
project is my GAE app. The above is generated when I run the command appcfg.py upload_data. Is there a way to tell GAE where to store those log files, for instance in a log folder?
Use the --log_file=... option to appcfg.py, as described in the appcfg.py documentation: with this command-line option you can give the complete path to the log file, including folder and name. (You cannot give JUST the folder and let it figure out the name; for that, you would need to write a tiny script that figures out the name and then calls appcfg.py.)
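For example (C:\dev\logs is just a placeholder and must already exist; the other upload_data options are omitted here):
rem Write the bulkloader log into a logs subfolder instead of the current directory
appcfg.py upload_data --log_file=C:\dev\logs\bulkloader-20090912.log project
If I remember correctly, the progress .sql3 file can be relocated in the same way with --db_filename, but check appcfg.py help upload_data to confirm.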