How to implement a tool to validate content and push it to the cloud - database

I am currently committing text files and images that must follow a certain format and content style to a GitHub repository. Validation is done via pre-commit hooks, so all files committed to the repository are valid. I also need git to keep track of when files are updated and of the versions that existed previously.
In the future I want to move to storing the files in a cloud service instead of the repository. This is the solution I thought of:
Have a script that takes the directory you are trying to upload; the directory name would need to follow a certain format, e.g. <City><Street>.
If the folder already exists in the cloud, the script compares the local contents to the cloud copy; if not, the whole folder gets uploaded.
Before upload we run content format validation; if a file doesn't pass, we report errors to the user.
If there were previous file versions, we store them in a different folder, appending the date/time to the filename.
The cloud now has the latest version.
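For what it's worth, a minimal sketch of that flow, assuming Node.js and S3 via the aws-sdk v2 package; the bucket name, the <City><Street> regex, and the validateContent() stub are placeholders for your own rules:

// upload.js - validate a folder, archive previous cloud versions, upload the latest
const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk'); // assumption: S3; adapt for your provider

const s3 = new AWS.S3();
const BUCKET = 'my-content-bucket'; // placeholder

function validateName(dir) {
  // placeholder for the <City><Street> rule, e.g. "BerlinHauptstrasse"
  return /^[A-Z][a-z]+[A-Z][a-z]+$/.test(path.basename(dir));
}

function validateContent(file) {
  return true; // placeholder: your format/content-style checks go here
}

async function upload(dir) {
  if (!validateName(dir)) throw new Error('folder name must match <City><Street>');
  const folder = path.basename(dir);
  // validate everything first, so nothing is uploaded if any file fails
  for (const name of fs.readdirSync(dir)) {
    if (!validateContent(path.join(dir, name))) throw new Error('invalid content: ' + name);
  }
  for (const name of fs.readdirSync(dir)) {
    const key = folder + '/' + name;
    // if a previous version exists, archive it with a timestamp before overwriting
    const existing = await s3.listObjectsV2({ Bucket: BUCKET, Prefix: key }).promise();
    if (existing.Contents.some(o => o.Key === key)) {
      await s3.copyObject({
        Bucket: BUCKET,
        CopySource: BUCKET + '/' + key,
        Key: 'archive/' + key + '.' + new Date().toISOString(),
      }).promise();
    }
    await s3.upload({
      Bucket: BUCKET,
      Key: key,
      Body: fs.createReadStream(path.join(dir, name)),
    }).promise();
  }
}

upload(process.argv[2]).catch(e => { console.error(e.message); process.exit(1); });

Note that S3-style object stores also offer built-in bucket versioning, which could replace the manual archive folder entirely.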
I would lose a lot of the advantages that I had with version control and the pre-commit hooks before. I would gain the ability to just pull a specific folder, something that GitHub doesn't allow me to do. What would be a better way to implement this? Is there a tool that would be good for this?

Related

How to build an Alexa skill with multiple developers?

I'm struggling to set up the pipeline for building an Alexa skill across several developers, and the existing docs just aren't cutting it.
We have four developers, and when we check our code into our git repo, check out new branches and so forth, we're continually overwriting our .ask/config and skill.json files.
How do we set this up to avoid overwriting? Ideally we're all building towards the same Alexa skill but we'd each like to test in our own instance -- separate skills and separate lambda functions.
As soon as I grab another developer's branch, I lose my necessary config and skill files.
My .gitignore has these files ignored, but since they're already checked in, they're continually being tracked.
How do I handle multiple developers?
I see several problems here.
First of all, clean up your repo: make sure that all developers have a .ask/* entry in their .gitignore files and that the .ask directory is removed from the origin (git rm -r --cached .ask will untrack it without deleting it locally).
To solve the overwriting problem, you can create a template-skill.json with placeholders for the Lambda ARNs and all the other things that differ for each developer. Then, before ask deploy, just create the valid skill.json file by running some script that replaces the placeholders in the template JSON with your data (kept in another gitignored file).
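As an illustration only (the {{PLACEHOLDER}} syntax and the dev-config.json name are made up here, not ASK CLI conventions), the replacement script can be as small as this:

// make-skill-config.js - render skill.json from template-skill.json
const fs = require('fs');

const template = fs.readFileSync('template-skill.json', 'utf8');
// per-developer values (Lambda ARNs etc.), kept in a gitignored file
const dev = JSON.parse(fs.readFileSync('dev-config.json', 'utf8'));

const rendered = template.replace(/\{\{(\w+)\}\}/g, function (_, key) {
  if (!(key in dev)) throw new Error('missing value for placeholder ' + key);
  return dev[key];
});

fs.writeFileSync('skill.json', rendered);
console.log('skill.json generated; now run "ask deploy"');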
Set up the same in your CI instance with configurations for different environments.

How to update .env file and share among teammates?

I created a .env file with params, pushed it to GitHub, and my teammates downloaded the repo. In the next push I added the .env file to .gitignore. Now I need to make changes to the .env file, but how will my teammates get them if .env is ignored? What is the right way of doing this kind of manipulation?
UPDATE:
I used two libraries to manage env variables:
https://www.npmjs.com/package/dotenv
https://www.npmjs.com/package/config
You do not store the configured .env file in the repository; instead, you create a .env.dist (or anything named like that) and commit that file. Your .dist file can contain all keys commented out, or all keys with default values. It's all up to you, but you need to ensure your template does not contain any sensitive data either:
DB_HOST=
DB_USER=
The main benefit is that you do not have .env in the repo, so each developer can easily set up their own system as they like/need (i.e. local db, etc.) and there's no risk of having such a file accidentally overwritten on the next pull, which would be frustrating.
Also (again), you do not store any sensitive data in the repository, so while your .env.dist can and maybe even should be pre-configured with your defaults, you must ensure any credentials are left empty, so no one can e.g. run the code against a production machine (or figure out anything sensitive based on that file).
Depending on the development environment you use, you can automate the creation of the .env file, using the provided .env.dist as a template (which is useful e.g. with CI/CD servers). As the dotenv file format is pretty simple, processing it is easy. I wrote such a helper tool for PHP myself; it is pretty simple code and can easily be ported to any other language if needed. See process-dotenv on GitHub for reference.
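For example, a minimal Node.js version of that idea (process-dotenv itself is PHP; letting real environment variables override the template defaults is just one assumption about how you might want it to behave):

// make-env.js - create .env from .env.dist, letting real env vars override defaults
const fs = require('fs');

const lines = fs.readFileSync('.env.dist', 'utf8').split('\n');
const out = lines.map(function (line) {
  const m = line.match(/^([A-Za-z0-9_]+)=(.*)$/);
  if (!m) return line; // keep comments and blank lines as-is
  const key = m[1], fallback = m[2];
  return key + '=' + (process.env[key] !== undefined ? process.env[key] : fallback);
});

fs.writeFileSync('.env', out.join('\n'));
console.log('.env written from .env.dist');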
Finally, if for any reason the config setup is complicated in your project, you may want to create e.g. an additional small script that collects all the data and writes an always up-to-date config file (or upgrades the existing one, etc.).

How to keep multiple versions of an artifact distinguished by properties in Artifactory?

Is it possible to somehow keep different versions of the same file in Artifactory, where the versions are differentiated by their properties?
For example, let's have a file foo.
I upload the file to Artifactory via the REST API and set the ver=1 property.
The file changes; I upload it again, this time with the ver=2 property.
Later I try to access the ver=1 file, but get a 404 error.
I understand that Artifactory keeps different versions of artifacts which are associated with different builds. However, there is no build info other than the "custom property" for the files I am uploading. How can I version them?
You must make sure that each artifact is also deployed with a unique path/file name. It is not enough to have a different set of properties.
Usually the best way to version the file is to have the version number as part of the file name, and possibly also as part of the path.
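For example (a sketch only; the host, repository name, and token are placeholders, and it needs Node 18+ for the global fetch), a deploy that keeps the version in the path and still sets your custom property via Artifactory's matrix parameters:

// deploy.js - upload foo with the version in the path and as a property
const fs = require('fs');

const version = '2';
const url = 'https://artifactory.example.com/artifactory/my-repo/foo/'
  + version + '/foo-' + version + '.bin;ver=' + version; // ;ver=... sets the property

fetch(url, {
  method: 'PUT',
  headers: { Authorization: 'Bearer <token>' }, // placeholder credentials
  body: fs.readFileSync('foo.bin'),
}).then(function (res) {
  if (!res.ok) throw new Error('deploy failed: ' + res.status);
  console.log('deployed foo version ' + version);
});

With this layout, ver=1 stays retrievable at .../foo/1/foo-1.bin even after version 2 is deployed.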

Indesign real-time package for collaboration

I manage a team of designers working in InDesign.
When we work on a project, it often happens that a designer has to work on the project of another. We work with Dropbox for Business.
But when we take over the work of another designer, there are often missing links and fonts.
Is there a plugin, or a way to develop a plugin, that would, when we create a new indd file (or when saving the same file):
Automatically create a "Links" folder and a "Document fonts" folder alongside the indd file
Systematically add each new link or font to the corresponding folder?
To simplify: on each action on a font or a link, make a kind of "InDesign package" in real time?
If this is not possible, do you have other solutions to meet this need?
I don't know of a specific script or plugin that does this.
However, it should be possible to write a script with an event handler for the beforeClose event that runs certain script commands every time a user closes a document (or even every time a user adds, changes or deletes a link). At this point the script could run some copyLink commands on all the images and fonts (?), placing them all in folders next to the document.
The whole script could be made a startup script, so it becomes active anytime any user runs InDesign.
(I'm actually not sure if fonts can be copied so easily. The worst case scenario would be that the script would need to run some packaging command to gather the fonts somewhere, copy them over to where you need them and then delete the rest of the temporary package.)
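To give a rough idea, a hedged ExtendScript sketch of that event handler (untested; in particular, whether copyLink also covers fonts is exactly the open question above):

// collect-links.jsx - startup script: copy a document's links next to it on close
#targetengine "session"

app.addEventListener("beforeClose", function (event) {
  var doc = event.target;
  if (doc.constructor.name !== "Document" || !doc.saved) return;

  var folder = new Folder(doc.filePath + "/Links");
  if (!folder.exists) folder.create();

  for (var i = 0; i < doc.links.length; i++) {
    try {
      doc.links[i].copyLink(folder); // copy the linked file into the Links folder
    } catch (e) {
      // the link may be missing or embedded; skip it
    }
  }
});

Dropped into the InDesign startup scripts folder, this would run for every user, every time they close a document.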
Did you consider Creative Cloud Libraries? They are meant to allow sharing assets within a team. Apart from that, your users would need to have the same access to the file system (a common drive letter for the network path, for example).
Another solution would be to use a DAM solution so users would link files from the DAM.
Finally, you could certainly consider a script like the one mdomino suggested.

Delta changes in GAE

I'm wondering, when I press "deploy" in the Google App Engine Launcher, how it syncs my changes to the actual instance... maybe it would be better to ask specific questions :)
1) Does it only upload the delta changes (as opposed to the entire file) for changed files?
2) Does it only upload new files and changed files (i.e. does not copy pre-existing, unchanged files)?
3) Does it delete remote files that do not exist locally?
4) Does all of this happen instantaneously for the end user once the app has finished deploying? (i.e. let's say I accidentally uploaded an insecure file that sits at example.com/passwords.txt; if #3 is true, then once I remove it from the local directory and re-deploy, it should be gone, but can I be sure it is really gone and not cached on some edge somewhere?)
If you use only the launcher or the appcfg util, as opposed to managing your code by means of git, App Engine will keep only one "state" of that particular version of your app and will not store any past state. So,
1) Yes, it uploads only deltas, not full files.
2) Yes, only new, modified or deleted files.
3) Yes, it deletes them if you delete locally and deploy. As Ibrahim Arief suggested, it is a good idea to use appcfg so you can prove it to yourself.
4) Here there are some caveats. With your new deploy, your old instances are sent a kill signal, and until it actually gets executed, there is a time span (seconds to minutes) during which new requests could hit your previous version.
The point Port Pleco has made is also very important. You have to be careful with caching of static files. If you have a file with Expires or Cache-Control headers and it is actually served, then it could be cached in various places, so the existence of old copies of it is completely out of your control.
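For context, those cache lifetimes typically come from your own app.yaml static handlers, e.g. (illustrative paths and values):

# app.yaml excerpt: cache lifetimes for static files
default_expiration: "1h"   # default for all static file handlers

handlers:
- url: /static
  static_dir: static
  expiration: "10m"        # per-handler override

Anything served with such headers may outlive a deploy in browser and edge caches.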
Happy coding!
I'm not a google employee so I don't have guaranteed accurate answers, but I can speak a little about your questions from my experience:
1) From what I've seen, it does upload all files each time
2) See 1, I'm fairly sure everything is uploaded
3) I'm not entirely sure whether it "deletes" the files, but I'm 99% sure that they're inaccessible if they don't exist in your current version. If you want to ensure that a file is inaccessible, then you can deploy your project with a new version number, and switch your app version to the new version in your admin panel. That will force google to use all your most recent files in that new version.
4) From what I've seen, changes that are rendered/executed, like html hardcoded text or controller changes or similar, appear instantly. Static files might be cached, as normal with web development, which means that you might have old versions of files saved on user's machines. You can use a query string on the end of the file name with the version to force an update on that.
For example, if I had a javascript file that I knew I would want to redeploy regularly, I would reference it like this:
<script type="text/javascript" src="../javascript/file.js?version=1.2"></script>
Then I would just increment the version number manually whenever I needed to force deployment of the JavaScript to my users.
