We have run into a situation where, because our SQL Server was down, a large number of Azure Logic App runs failed.
What is the best way to bulk-restart them, without doing it manually one by one in the UI? Is it possible to do that using the CLI / REST APIs?
For now, you can bulk-resubmit Logic App runs from the Runs Dashboard: select the runs you want to resubmit and click the Resubmit button at the top right.
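It is also possible via the REST API: the Workflow Runs - List and Workflow Trigger Histories - Resubmit operations let you find the failed runs and resubmit them in a loop. A minimal Python sketch, assuming you already have an ARM bearer token (e.g. from `az account get-access-token`); the subscription, resource group, workflow, and trigger names are placeholders:

```python
import json
import urllib.parse
import urllib.request

ARM = "https://management.azure.com"
API_VERSION = "2016-06-01"

def runs_url(sub, rg, workflow):
    """List-runs endpoint, filtered to failed runs only."""
    flt = urllib.parse.quote("status eq 'Failed'")
    return (f"{ARM}/subscriptions/{sub}/resourceGroups/{rg}"
            f"/providers/Microsoft.Logic/workflows/{workflow}/runs"
            f"?api-version={API_VERSION}&$filter={flt}")

def resubmit_url(sub, rg, workflow, trigger, run_name):
    """Resubmit endpoint for one run of the given trigger."""
    return (f"{ARM}/subscriptions/{sub}/resourceGroups/{rg}"
            f"/providers/Microsoft.Logic/workflows/{workflow}"
            f"/triggers/{trigger}/histories/{run_name}/resubmit"
            f"?api-version={API_VERSION}")

def bulk_resubmit(token, sub, rg, workflow, trigger):
    """List failed runs, then POST a resubmit for each of them."""
    headers = {"Authorization": f"Bearer {token}"}
    req = urllib.request.Request(runs_url(sub, rg, workflow), headers=headers)
    with urllib.request.urlopen(req) as resp:
        runs = json.load(resp)
    for run in runs.get("value", []):
        req = urllib.request.Request(
            resubmit_url(sub, rg, workflow, trigger, run["name"]),
            headers=headers, method="POST")
        with urllib.request.urlopen(req) as resp:
            print(run["name"], resp.status)
```

The trigger name must be the one that started the runs (often "manual", or the name of the recurrence trigger); this resubmit operation is the same one the portal's Resubmit button calls.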
I have an Azure Automation runbook written in PowerShell. Every week it takes a SQL DB and creates a backup in blob storage. This had been working for over a year and abruptly stopped producing .bacpac files in blob storage last month.
Nothing appears to be wrong with the code: it has not changed, and running it in the test pane with the proper parameters, which have also remained unchanged, completes successfully. When the scheduler calls the job, the runbook runs with no errors or warnings; it just does not produce the .bacpac file in blob storage.
Has anyone run into this problem or something similar? I am looking for suggestions on where to check, within the portal, SQL, or elsewhere, to resolve this issue. I cannot seem to find the problematic moment via the activity log.
I have a report that prompts for a database login instead of running. If I log in it works, but I don't want the users to have to log in. I have checked the connection: it connects from the boxes to the database via ODBC. The report works completely before it is promoted (it is promoted from 4.1 to 4.1). Other reports using the same database are working, but some still give this prompt as well. Most are working, and they use the same connection in BOE. This exact report works in Dev. Do you have any ideas?
In BOE, after you upload the report, right-click it and go to Database Configuration. There, select the appropriate driver and provide the database login credentials.
After providing the details, select the radio button that says to use the same database logon every time (probably the third option) and click Save.
Hi, I fixed it by reinstalling the native client.
I have a strange problem that keeps recurring. My app is an MVC5 application using EF Code First and SQL Express. My solution uses 4 databases, 3 of which work without a problem. The fourth database is constantly 'acquired' by 'System'. Every day for the last week I have had this problem on boot.
When I run the app and the DatabaseInitializer runs, I get this exception:
Cannot open database "databaseName" requested by the login. The login failed. Login failed for user 'Computer\UserName'.
Weird. I read some S.O. articles about changing permissions, but that smells wrong to me. This database shouldn't be in someone else's domain in the first place! The other databases work fine!
So I go about deleting the database. I get this error.
Error deleting file 'database.mdf'. The process cannot access the file because it is being used by another process.
So I reboot and go to delete the database directly from the windows explorer.
The action can't be completed because the file is open in System
I was finally able to delete the file by booting my PC in safe mode and deleting it there. I'm not sure what the cause of this is. I am running SQL Express and I don't see any SQL background services running.
What is the easiest way to sync or replicate files over a network? I am running a few servers in a rack for load balancing and wish to mirror content.
For example, I have Server 1, which I do my FTP uploads to. Server 2 is another node. The manual way is for me to also do the FTP uploads to Server 2, but I am sure it can be done automatically, without my presence and without a user logged in.
I have tried SyncToy but it just doesn't run when the user is not logged in. I have to manually run it.
Are there better ways? I saw Microsoft DFS but it is just too complicated for me to set up.
Try SyncBack.
There's a lightweight version (SE), which is free, and a Pro version.
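If you prefer something you control completely, a one-way mirror is also easy to script and run from Windows Task Scheduler, which (unlike SyncToy in this setup) can run whether or not a user is logged in. A minimal stdlib-only Python sketch, with placeholder paths; on Windows, `robocopy /MIR` on a schedule achieves the same thing:

```python
import filecmp
import os
import shutil

def mirror(src, dst):
    """One-way mirror: copy new/changed files from src to dst, then delete
    anything in dst that no longer exists in src."""
    os.makedirs(dst, exist_ok=True)
    # pass 1: copy new or changed files
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s, d = os.path.join(root, name), os.path.join(target, name)
            if not os.path.exists(d) or not filecmp.cmp(s, d, shallow=False):
                shutil.copy2(s, d)  # copy2 preserves timestamps
    # pass 2: remove files/dirs in dst that are gone from src
    for root, dirs, files in os.walk(dst, topdown=False):
        rel = os.path.relpath(root, dst)
        source = src if rel == "." else os.path.join(src, rel)
        for name in files:
            if not os.path.exists(os.path.join(source, name)):
                os.remove(os.path.join(root, name))
        for name in dirs:
            if not os.path.exists(os.path.join(source, name)):
                shutil.rmtree(os.path.join(root, name), ignore_errors=True)

# hypothetical usage, e.g. scheduled every few minutes:
# mirror(r"D:\ftp\uploads", r"\\server2\ftp\uploads")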
What is the most secure and easiest way to send approximately 1000 different records from a Windows application into a database that is not directly accessible, i.e. a MySQL database on a web provider's server?
The data will be stored in different tables.
Edited:
The application will be distributed to users who have no idea what a database or PuTTY is. They just install my application, open it, enter some data, and press Submit.
Currently I'm using PHP: the application uploads the generated script to the web server, and PHP processes it there. I think I should also include some signature in the file to prevent "DROP ..." attacks.
If you can export the data as an SQL script, you can just run it against the remote server using your application of choice. 1000 records won't create that big a script.
In the current project at my job we have the same situation: a remote (faraway) database.
My solution: serialize the SQL query into XML and send it via HTTP to a web daemon running on the remote server, instead of exposing the SQL server. The daemon checks credentials and executes the query.
As I can't execute any external programs on the external server, I created the following solution:
My program creates a script file and calculates its salted hash.
The program sends this file, together with the user credentials and the hash, to a PHP page on the server.
The PHP page checks the username and password, then verifies the hash, and then executes the script. Only INSERT and UPDATE commands are allowed.
Is this approach secure enough?
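The client side of the scheme above could be sketched like this; the shared secret is a placeholder, the `is_allowed` whitelist mirrors what the PHP page is described as enforcing, and HMAC-SHA256 stands in for the salted hash (a keyed hash is harder to forge than hash(salt + data), and the PHP side can recompute it with `hash_hmac('sha256', ...)`):

```python
import hashlib
import hmac

# Assumption: a long random secret embedded in the app and known to the PHP page.
SHARED_SECRET = b"replace-with-a-long-random-secret"

def sign_script(sql_script: bytes) -> str:
    """HMAC-SHA256 over the generated script; sent alongside the script so the
    server can detect tampering before executing anything."""
    return hmac.new(SHARED_SECRET, sql_script, hashlib.sha256).hexdigest()

def is_allowed(sql_script: str) -> bool:
    """Client-side mirror of the server whitelist: INSERT and UPDATE only."""
    for stmt in filter(None, (s.strip() for s in sql_script.split(";"))):
        if not stmt.upper().startswith(("INSERT", "UPDATE")):
            return False
    return True
```

One caveat on the design: because the secret ships inside the distributed application, a determined user can extract it, so the server-side statement whitelist (and per-user credentials) is doing the real security work here, not the hash.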