Background:
We run a content management platform that hosts 20+ separate websites - some intranets and some internet sites - so they have different endpoints routed for internal or external access.
We are currently upgrading our infrastructure - including software versions, hardware, and changes to IP/VIP/DNS entries - which affects all of the sites.
I want to be able to run a repeatable site test against all sites to check everything is working fine, and I'd like to do it from different endpoints (locally on each box in the cluster, at the cluster level, from the internet, from the intranet, etc.).
Does anyone know of a simple tool that requires no software to install in order to run a repeatable regression test against a whole bunch of defined URLs?
I was thinking of an HTML page that I can run from different locations and that is essentially a link checker.
Can anyone recommend a simple way to provide a level of automatic testing of our sites (in addition to our manual verification)?
Thanks
Sounds like you're looking for Selenium: http://seleniumhq.org/
Edit
Wait, I think you mean 'Testing' them as in, check to see if they're online and reachable? Then I might just automate a series of ping or telnet commands, and check appropriate things. Would take a matter of minutes to write a little app in any language to do this.
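For example, here is a rough sketch in C# of the kind of little app I mean; the URLs are placeholders for your own internal/external endpoints:

```csharp
// Minimal reachability check: request each URL and report the HTTP status.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class SiteChecker
{
    static async Task Main()
    {
        // Placeholder URLs: substitute your own sites/endpoints.
        var urls = new[]
        {
            "https://www.example.com/",
            "https://intranet.example.com/"
        };

        using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(10) };
        foreach (var url in urls)
        {
            try
            {
                var response = await client.GetAsync(url);
                Console.WriteLine($"{url} -> {(int)response.StatusCode} {response.StatusCode}");
            }
            catch (Exception ex)
            {
                Console.WriteLine($"{url} -> FAILED ({ex.Message})");
            }
        }
    }
}
```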
There is all sorts of web site monitoring software available (check google, or ask for recos here). That's what you're looking for. There is a whole range from free to very expensive that monitor and stress your site from around the world.
Or you can write simple shell scripts that do what you want.
>> 'Testing' them as in, check to see if they're online and reachable?
Yes that's exactly it!! I was thinking the same - I could script something up but I thought I'd check first to see if someone has already done this - I guess not!
Thanks
Doesn't fit the "requires no software to install to run" part, and it's not necessarily super-cheap, but we've had great results with Radar Website Monitor for this kind of thing.
I have two simple GUI tests on a WPF application. They work fine from Visual Studio.
Now I am trying to do the same in my Azure pipeline. I keep bumping into the same error right at the start:
"OpenQA.Selenium.WebDriverException: An element could not be located on the page using the given search parameters"
Basically I am working along the lines of WinAppDriver in CI with Azure Pipelines. One difference from the example is that I have put my tests in my build pipeline instead of my release pipeline. As far as I can determine, that should not be the cause of my problem, since I am using the same Windows-2019 agent.
This is what the pipeline looks like, up to the tests:
In the past I have had problems from Visual Studio similar to those described here, but I have left those behind: FindElementByName - Element couldn't be located
Nevertheless, I have tried to improve on these aspects by using DefaultWait. It works fine from Visual Studio but has not helped me on Azure.
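For reference, this is roughly how the wait is set up; the element name "Save" is just a placeholder from my test:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Appium.Windows;
using OpenQA.Selenium.Support.UI;

// 'session' is the WindowsDriver<WindowsElement> already created against WinAppDriver.
var wait = new DefaultWait<WindowsDriver<WindowsElement>>(session)
{
    Timeout = TimeSpan.FromSeconds(30),
    PollingInterval = TimeSpan.FromMilliseconds(500)
};
wait.IgnoreExceptionTypes(typeof(WebDriverException));

// Poll until the element can be located instead of failing immediately.
WindowsElement saveButton = wait.Until(d => d.FindElementByName("Save"));
```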
I read a couple of times about an 'interactive mode' for the agent, but I don't see anything about that in the guideline, and it seems to apply to self-hosted agents. I also don't see any configuration option for it on the standard agent. So I am confused about that.
I am lost here.
Could someone tell me what I am missing?
It would be much appreciated, I have been spending quite some time on this already.
It took me a while to find the necessary information and piece together the solution, but it turned out to work.
The key thing is using an interactive agent, which has to be self-hosted. The crucial instruction for me was following Self-hosted Windows agents. This meant installing the agent environment on my own hardware, which turned out to be pretty simple and straightforward.
This implies one has to start up this environment with PowerShell, and probably watch over the graphical test as it executes. For a one-man team like mine that is no problem; for a larger team this may be an issue.
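Concretely, once the agent is unpacked, you configure it once and then start it from PowerShell whenever you want it available (these are the standard commands from the Self-hosted Windows agents guide):

```powershell
# One-time configuration: prompts for the Azure DevOps URL and a PAT.
.\config.cmd

# Run the agent interactively; keep this window open while the tests execute.
.\run.cmd
```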
An additional advantage is that the entire build and execution environment is now under your own control for overview and inspection.
It's fun to push a commit, watch the pipeline progress on Azure, and then see the graphical test start up on my own hardware.
So my company installed PostgreSQL on my computer, which I use, rarely and without understanding, for one specific function.
I'm trying to follow Lynda etc. tutorials to understand (Postgres)SQL better, since that's what we use, but all the tutorials ask students to reconfigure certain aspects of their system in order to follow along with example files (which I would really like to do).
Since I've messed up my dev env once already, I'm hesitant to touch anything that will cause issues with the local versions of our project.
I know this is an extremely wide-angle question with no easy answer, but if anyone has any general advice for playing with sample databases in MAMP Pro (or anywhere else) using Postgres without interfering with the servers I'm currently running, it would be a huge help.
I would recommend you use Vagrant and set up an isolated PostgreSQL instance. Here is a great wiki you can follow to do this.
UPDATE: Given your comment, an easy solution is to just back up your data and proceed with trying out the Postgres examples; you can always restore your data after you are done.
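For example, using PostgreSQL's own tools (the database and user names here are placeholders):

```bash
# Back up the database in custom format before experimenting.
pg_dump -U myuser -Fc mydb -f mydb.backup

# ...try out the tutorial examples...

# Restore afterwards, dropping any objects the dump will recreate.
pg_restore -U myuser -d mydb --clean mydb.backup
```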
I have been looking at the various MEAN stack frameworks out on the net, and whilst I am impressed with what they achieve, I have one serious concern: the number of files used in a typical stack. meanstack.js uses over 15,000 files, whilst the bmean example has a modest 1,900 in comparison.
The question I am asking myself is whether I would be happy to put my trust in such a system from a production viewpoint. What happens when something goes wrong, and how easy is it going to be to find the answer? You can almost bet that when your most important customer logs on, it is going to go haywire. Also, what happens when Angular version 2 comes along? It could require a complete rewrite, but by then the stack you're using has been customised and is difficult to change.
Am I getting overly concerned about the technology? My intended approach is to strip the client-side code out of the bmean example and rewrite it with my own; at least that way I know (and control) what goes on in the client. Do you think this is the correct way to proceed?
With most systems there is a bit of preparation required before going to production. The same is true with mean.io (using multiple CPUs, improved aggregation, caching, etc.).
The large number of files is essentially a product of the way npm handles dependencies. Each module is able to define independent versions of the same dependencies, thus creating a bit of bloat but at the same time allowing a lot of flexibility in Node.js code.
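For example, two modules can each pull in their own copy of the same library at different versions (the names and versions here are made up):

```
node_modules/
├── express/
│   └── node_modules/
│       └── lodash/   (one version)
└── mongoose/
    └── node_modules/
        └── lodash/   (another version)
```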
We currently have a number of mean.io projects in production phase and have been very happy with performance and the overall experience.
New releases of the project are scheduled every couple of months, upgrading should not be too much of a problem if you use the package system correctly.
Issues with the project are handled and managed through GitHub issues; additional support can be found on our IRC channel (freenode #mean_io) as well as on Facebook.
For commercial support, have a look at the support page.
I am currently investigating possible options for a migration framework/tool. I like the idea of Ruby migrations, on which the above frameworks are based.
So I am asking for your experience, opinions and maybe a comparison between them. Are you using them in production?
Thanks for the responses. The goal of this question was to get a feeling for which tool is used most in the developer community, but it seems that migrations are not a hot topic here.
Anyway, I have decided to go with MigSharp, as the codebase seems to be pretty clean, it is quite easy to handle, and it has built-in support for MS SQL CE. The runner-up would have been FluentMigrator, for which I was not able to produce a working example for Compact Edition.
Cheers
I use FluentMigrator in production, and am a longtime contributor to FM. I think your question is too general; be more specific. Also, FM has a Google group which is fairly active if you want FM information.
FM is derived from migrator.net, as I recall. It uses a fluent syntax and supports multiple databases. We have taken some inspiration from Rails migrations, but it's definitely not a port. Worth checking out.
One thing I've learned is not to put your migrations in the same assembly as your app code. Separate them into a migration assembly, and use that for migrating your databases.
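To give a flavour of the fluent syntax, a migration looks roughly like this (the version number and table definition are just examples):

```csharp
using FluentMigrator;

// The attribute value is the migration version; an example value here.
[Migration(34)]
public class AddUsersTable : Migration
{
    public override void Up()
    {
        Create.Table("Users")
            .WithColumn("Id").AsInt32().PrimaryKey().Identity()
            .WithColumn("Name").AsString(100).NotNullable();
    }

    public override void Down()
    {
        Delete.Table("Users");
    }
}
```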
Also, you should always work on multiple environments to avoid problems with migrations run straight against production. I always have at least a development and production environment, and most of the time there is a testing environment as well.
I use mig#.
It works well, but you will need to have some guidelines for usage - as migrations can get complicated.
We use a sequence number at the end of our migrations rather than a date-time stamp. This is because we don't know when the date-time stamp was set (when they began the source code change-set, just before committing, or some time in between); different developers could use different approaches.
Names such as Migration_0000034.cs give you plenty of space.
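A migration itself is a small class; here is a rough sketch assuming Mig#'s IMigration interface (the table definition is just an example):

```csharp
using System.Data;
using MigSharp;

// Mig# derives the migration's version from the digits in the class name,
// so sequence-numbered names like this stay unambiguous.
[MigrationExport]
public class Migration_0000034 : IMigration
{
    public void Up(IDatabase db)
    {
        db.CreateTable("Customers")
          .WithPrimaryKeyColumn("Id", DbType.Int32).AsIdentity()
          .WithNotNullableColumn("Name", DbType.String).OfSize(100);
    }
}
```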
At this point, I would stick with migrator.net. I like the promise of FluentMigrator, but it does not seem to be under any more active development than migrator.net (see the issues and pull requests that have languished on their GitHub site).
There is also no easy way to do an ExecuteScalar(). I'd add it, but I don't want to create my own fork, and I see no reason that a pull request would actually land in master. (Execute.WithConnection takes an Action, so it will fire on demand rather than when I need it to fire.)
So for me, I'm heading back to migrator.net.
I know how to write programs in Java and C++, and would like to learn how servers, databases and Internet based applications work so I could start developing them.
Where should I start? What should I learn first? What books would you recommend for me?
Thank you, in advance :)
I would start by trying Tomcat, which would let you create fairly basic web applications, and playing around with either servlets or JSPs. There is a lot of documentation and there are plenty of examples.
Or you could start by downloading and playing around with a database. PostgreSQL is really good: it is free, and it has a tool called pgAdmin which is a really good IDE.
Once you have these set up and working, I would start taking a look at the various frameworks that exist to make using these tools a lot easier. For example, you could take a look at Guice or Spring for dependency injection, or a range of other tools. This is a comparison of each.
You will probably also want to look into Velocity, FreeMarker, Struts, or something similar. These will make your life a lot easier.
For the database you could look at Hibernate or MyBatis; both are really good and function slightly differently. Hibernate is very commonly used and caches objects very efficiently.
I don't know what you mean by "cells"; anyway, you may start with open-source technologies and their online docs, like Apache, MySQL, and PHP.