We have more than 10 skills for dry cleaners around the U.S., and they all receive the same updates, but we have to individually publish them. How can we set up a system that allows us to submit them ALL for review at the same time?
Doing this from the command line is a plausible option.
Thanks!
You can use the ASK CLI to deploy each skill with one command per skill:
ask deploy
You can also choose to deploy only the Lambda function:
ask deploy -t lambda
https://developer.amazon.com/docs/smapi/quick-start-alexa-skills-kit-command-line-interface.html
A better long-term solution is to use a CI/CD pipeline.
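To submit all of the skills in one shot, a small shell script can loop over the skill projects and run the deploy command in each. This is only a sketch: it assumes every skill lives in its own ASK-CLI-initialized directory under skills/, and the DEPLOY_CMD override is an illustration device, not part of the ASK CLI.

```shell
#!/bin/sh
# Sketch: deploy every skill project found under ./skills with one
# 'ask deploy' each. Assumes each skill directory was set up with the
# ASK CLI so that 'ask deploy' works from inside it. DEPLOY_CMD can be
# overridden (e.g. for a dry run) and defaults to the real command.
DEPLOY_CMD="${DEPLOY_CMD:-ask deploy}"

for dir in skills/*/; do
  echo "Deploying ${dir}"
  ( cd "$dir" && $DEPLOY_CMD ) || echo "Deploy failed for ${dir}" >&2
done
```

A CI/CD pipeline would then just run this script on every push, so all skills go out for review together.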
A friend and I want to work on the same project, but we are in different locations.
I will be working on the backend while he works on the frontend.
How do I share my backend API with him? What is the best solution, app, or tool to use?
Strongly recommend using git as a collaboration/version control tool. You can sign up for free at github.com, which now supports private repositories. There's a bit of a learning curve, but git is widely adopted and one of the standards for managing code between hundreds or even thousands of contributors across large projects.
Some of the basics:
1) think of git as a way to share code between developers
2) not only that, but you can manage change history and track changes over time
3) seamlessly manages most changes, enabling you and your team to view point-in-time versions
Check out the Git handbook at https://guides.github.com/introduction/git-handbook/ to get started!
To address your specific question:
when you are ready to share your backend code, check it into your git repository and let your collaborators know that updates are available
make sure to include instructions on how to use your backend code; do they run the server locally? is it deployed to a url? is it running in docker or kubernetes? is it authenticated, and how?
they will "pull" your changes and start working against them; when they have updates, they should commit them to git and push to the remote repository. You can then pull down their changes and review the full frontend/backend solution.
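The round trip above can be sketched end to end with a local bare repository standing in for GitHub (all paths, names, and file contents here are throwaway illustrations):

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for GitHub: a local bare repository (throwaway path).
git init --quiet --bare "$tmp/remote.git"

# Backend developer: clone, commit the API description, push.
git clone --quiet "$tmp/remote.git" "$tmp/backend"
cd "$tmp/backend"
git config user.email "dev@example.com"
git config user.name "Backend Dev"
echo "GET /users returns a JSON list of users" > API.md
git add API.md
git commit --quiet -m "Document /users endpoint"
git push --quiet origin HEAD
cd - >/dev/null

# Frontend developer: pull (here, a fresh clone) and read the update.
git clone --quiet "$tmp/remote.git" "$tmp/frontend"
cat "$tmp/frontend/API.md"
```

With a real GitHub remote, the frontend developer would run git pull instead of cloning fresh each time, but the flow is the same.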
You can use these tools to make your life easier:
GitHub or Bitbucket for code collaboration
Postman or Postwoman for API sharing
Jira Cloud or Clubhouse for issue tracking (free for up to 10 users)
Confluence for documentation
Slack for real-time communication
These are the tools I use for collaborating with others. This is just my opinion.
I would like to disable/enable Build Controller (or Build Agents) from a bat file. I want to do this so we can schedule builds every night, but then disable them during code-freeze. "TFPT builddefinition /enabled:false" is close... but that is only for cloning build defs. If not, is there a way to disable checkins from a bat file? Then I would edit my Build Def and uncheck the box for "Build even if nothing has changed since the previous build".
Thanks
You can make a REST call to the private and undocumented TFS API, but you should know what you are doing.
Or you can use a scheduled task to control the agent service installed on your build server.
But there are better ways to control your sources and releases.
It seems like the real problem is your TFS project setup. For example, use GitHub Flow with pull requests, so that no one can change master without an approved PR.
The developers can keep working, and you don't need to plan code freezes, remove permissions, or anything like that.
I also wouldn't stop the deployments for the dev and test systems.
If you want to prevent anybody from creating a release to a particular set of environments (staging and production), set approvers to control the release process.
From the Understanding GitHub Flow page:
"GitHub Flow is a lightweight, branch-based workflow that supports teams and projects where deployments are made regularly."
https://guides.github.com/introduction/flow/
I have an AngularJS site consuming an API written in Sinatra.
I'm simply trying to deploy these 2 components together on an AWS EC2 instance.
How would one go about doing that? What tools do you recommend? What structure do you think is most suitable?
Cheers
This is based upon my experience of using the HashiCorp line of tools.
Manual: Launch an Ubuntu image, gem install sinatra, and deploy your code. Take a snapshot for safekeeping. This one-off approach is good for a development box to iron out the configuration process. Write down the commands you run and any options you may need.
Automated: Use the Packer EC2 Builder and Shell Provisioner to automate your commands from the previous manual approach. This will give you a configured AMI that can be launched.
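As a sketch, a minimal Packer template for that automated approach might look like the following. The region, source AMI, instance type, and package list are all assumptions you would replace with your own values:

```json
{
  "builders": [{
    "type": "amazon-ebs",
    "region": "us-east-1",
    "source_ami": "ami-00000000",
    "instance_type": "t2.micro",
    "ssh_username": "ubuntu",
    "ami_name": "sinatra-angular-{{timestamp}}"
  }],
  "provisioners": [{
    "type": "shell",
    "inline": [
      "sudo apt-get update",
      "sudo apt-get install -y ruby ruby-dev build-essential nginx",
      "sudo gem install sinatra"
    ]
  }]
}
```

Running packer build on a template like this produces the configured AMI, which you can then launch as many times as you need.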
You can apply different methods of getting to an AMI using different toolsets. However, in the end, you want a single immutable image that can be deployed repeatedly.
I have developed an app in Twilio which I would like to run from the cloud. I tried learning about AWS and Google App Engine but am quite confused at this stage:
I have 2 questions which I hope to get your help on:
1) How can I store my scripts and database in the cloud? Right now, everything is running out of my local machine but I would like to transfer the scripts and db to another server and run my app at a predetermined time of day. What would be the best way to do this?
2) How can I write a batch file to run my app at a predetermined time of day in the cloud?
I understand this question does not include code, but I really hope someone can point me in the right direction. I have spent a lot of time trying to understand this myself but am still unsure. Thanks in advance.
Update: The application is a Twilio app that makes calls to people, the script simply applies an algorithm to make calls in a certain fashion and the database is a mysql db that provides the details of people to be called.
It is quite difficult to provide an exact answer without understanding what the application, the DB, or the script you wish to run actually is.
I can give you a couple of ideas that might be helpful in such cases.
OpsWorks (http://aws.amazon.com/opsworks/) is a managed service for managing applications. You can define your stack (multiple layers like web, workers, DB...) and which Chef recipes should run at various points in the life of the instances in each layer (startup, shutdown, app deployment, or stack modification). Then you can add instances to each layer on specific days and at specific hours to implement the run-at-predetermined-times functionality you requested.
In such a solution you can either keep some of your instances (like the DB) always on, or even bootstrap them using the Chef recipes every day, restoring from a snapshot on start and creating a snapshot on shutdown.
Another AWS service you can use is Data Pipeline (http://aws.amazon.com/datapipeline/). It is designed to move data periodically between data sources, for example from a MySQL database to Amazon Redshift, the data warehouse service. But you can also use it to trigger arbitrary shell scripts (http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-object-shellcommandactivity.html) and schedule them to run under various conditions, such as every hour/day or at specific times (http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-schedules.html).
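For illustration only, a pipeline definition using ShellCommandActivity might be sketched like this. The object names, schedule, command path, and instance settings are all assumptions; consult the ShellCommandActivity documentation linked above for the exact required fields:

```json
{
  "objects": [
    {
      "id": "DailySchedule",
      "type": "Schedule",
      "period": "1 day",
      "startDateTime": "2015-01-01T09:00:00"
    },
    {
      "id": "RunCallScript",
      "type": "ShellCommandActivity",
      "command": "python /home/ec2-user/make_calls.py",
      "schedule": { "ref": "DailySchedule" },
      "runsOn": { "ref": "CallWorker" }
    },
    {
      "id": "CallWorker",
      "type": "Ec2Resource",
      "instanceType": "t1.micro",
      "terminateAfter": "1 hour"
    }
  ]
}
```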
A simple path here would be just to create an EC2 instance in AWS, and put the components needed to run your app there. A thorough walk through is here:
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/get-set-up-for-amazon-ec2.html
Essentially you will create an EC2 virtual machine, which you can for most purposes treat just like any other Linux server. You can install MySQL on it, copy your script there, and run it. Of course whatever container or support libraries your code requires will need to be installed as well.
You don't say what OS you are using locally, but if it is Mac or Linux, you should be able to follow almost the same process to get your script running on an EC2 instance that you used on your local machine.
As you get to know AWS, there are sophisticated services you can use for deployment, infrastructure orchestration, database services, and so on. But just to get started running a script from a virtual machine should be pretty straightforward.
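Once your script and MySQL database are on the instance, the run-at-a-set-time requirement is usually handled with cron rather than a batch file. A hypothetical crontab entry (the interpreter and paths are assumptions) could look like:

```
# min hour day-of-month month day-of-week  command
0 9 * * * /usr/bin/python /home/ec2-user/make_calls.py >> /home/ec2-user/calls.log 2>&1
```

You would add a line like this by running crontab -e on the instance; this example runs the calling script every day at 09:00 server time and appends its output to a log file.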
I recently developed a Twilio application using Ruby on Rails for the backend and found Heroku extremely simple to set up and launch. While Heroku does cost more than AWS, I found that the time I saved using Heroku more than made up for it. As an early-stage startup, we wanted to spend our time developing important features, not "wasting" time optimizing our AWS cloud.
However, while I believe Heroku is ideal for early-stage websites/startups I do believe hosting should be reevaluated once a company reaches a certain size. At some point it becomes economically viable to devote resources into optimizing an AWS cloud solution because it will be cheaper than Heroku in the long run.
I am new to CakePHP and I am making an application where users fill out forms and then other users who are specified on the form have to add to the data. At the end of each day I want to send an email to all users who have been referenced on forms that day and tell them how many new forms they need to add information to.
I know how to run my query to figure out who I need to email and how to construct the email, but how do I make it happen once a day or at any set time? I have found something about cron jobs in my research, but I don't fully understand them or know whether they will work for me. I am working in a Windows environment and currently deploying my app to Heroku.
Thanks for any info!
Cheers,
Jon
Although this question is not really related to CakePHP but rather to Heroku, I suggest you install the Heroku Scheduler add-on.
Once installed, you can write a shell script, such as follows:
#!/bin/sh
php -f path_to/your_php_file/which_sends_emails.php
and name it sendemailjob.sh or something. Make it executable by
sudo chmod +x sendemailjob.sh
After that, you just need to tell Heroku Scheduler to execute that file daily. It should not take too much magic.
I am not quite sure whether you actually have shell access since you're on Windows; maybe there is a different solution for Windows.