Permissions of a Windows service regarding modified-timestamp reading on a network share

I have a Qt app running on Windows Server 2016 that monitors files on a mapped network drive for changes via QFileSystemWatcher. Change notifications do not work, so I want to fall back to regularly polling the "lastModified" timestamp. The problem: if I run the app as a desktop app, the timestamp is read correctly; if I run the same app as a service, the timestamp cannot be read. And I have to run it as a service to keep it alive permanently.
Is this a restriction of a Windows service?
If so, is there a workaround to read the timestamp?
Here is how I try to read the timestamp:
for (int i = 0; i < conf_.files_.count(); i++)
{
    QString f = conf_.files_.at(i);
    QFileInfo finfo(f);
    if (finfo.lastModified().isNull() || !finfo.lastModified().isValid()) {
        QString line(conf_.iniFileGroupName_ + ": Could not read lastModified-timestamp of file " + f);
        ApplicationLogger::instance().log(line);
        // TODO: Use winapi directly and see if it helps
        // https://learn.microsoft.com/en-us/windows/win32/api/fileapi/nf-fileapi-getfileattributesa
    }
    else {
        if (!lastModified_.contains(f) ||
            lastModified_.value(f) < finfo.lastModified())
        {
            QString line(conf_.iniFileGroupName_ + ": Adding file " + f + " to backlog based on modification timer. Last modified = " + finfo.lastModified().toString());
            ApplicationLogger::instance().log(line);
            addToBacklog(f, finfo.lastModified());
            lastModified_.insert(f, finfo.lastModified());
            checkBacklogAndCopy();
        }
    }
}
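The TODO above could look roughly like the following untested sketch (lastWriteTimeViaWinApi is a hypothetical helper name). One caveat worth hedging on: drive letters mapped in a user session are generally not visible to a service running under another account such as LocalSystem, so a UNC path may be needed no matter which API reads the timestamp.

// Untested sketch: read the last-write time via the Win32 API directly
// instead of through QFileInfo.
#include <windows.h>
#include <string>
#include <QDateTime>
#include <QString>

QDateTime lastWriteTimeViaWinApi(const QString& path)
{
    WIN32_FILE_ATTRIBUTE_DATA attrs;
    const std::wstring wpath = path.toStdWString();
    if (!GetFileAttributesExW(wpath.c_str(), GetFileExInfoStandard, &attrs)) {
        // GetLastError() distinguishes permission problems from bad paths,
        // which QFileInfo hides behind an invalid QDateTime.
        return QDateTime();
    }
    // FILETIME counts 100 ns ticks since 1601-01-01 UTC; convert to
    // milliseconds since the Unix epoch for QDateTime.
    ULARGE_INTEGER ticks;
    ticks.LowPart  = attrs.ftLastWriteTime.dwLowDateTime;
    ticks.HighPart = attrs.ftLastWriteTime.dwHighDateTime;
    const qint64 msecs = (ticks.QuadPart - 116444736000000000ULL) / 10000;
    return QDateTime::fromMSecsSinceEpoch(msecs, Qt::UTC);
}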
Ideally I'd like to get notified rather than poll, but that is out of scope for this question.

Related

Correlation Failed, Remote Login. AspNet Core Identity Server

Trying to gain some basic understanding of how this process works, as I am receiving the "Correlation failed" error. Let me first begin by describing the issue I'm encountering...
QAT is not working properly and is configured as follows:
I have an Identity Server running behind a load balancer for QAT.
All requests sent to the load balancer are https.
The traffic being forwarded to each application server (2 separate servers in this case) is http.
The Netscaler is adding all necessary X-Forwarded items to the header.
I have another application that also sits behind the load balancer for QAT.
There are 2 separate servers hosting this application, which the Netscaler will forward the traffic to.
This application is configured to use the X-Forwarded info from the Netscaler.
It is designed to authenticate using the above-mentioned Identity Server.
My issue is that I end up with a never-ending loop between the second application and the Identity Server when I deploy to QAT. This is strange to me, as my SYS environment works perfectly. My SYS environment has a separate instance of Identity Server and of the second application mentioned (except that there is only a single instance of each application being forwarded to). This also goes through the Netscaler and does all the X-Forwarded magic mentioned earlier.
In both situations the setup is identical. The only difference is that QAT has multiple servers hosting each app and SYS has only 1 server hosting each app.
My question is why would this behave differently?
Why would this work in SYS but not in QAT?
I think at this point we can rule out the callback path, cookie settings, etc., because it works in SYS.
Could it be that I need to implement some sort of Data Protection Key middleware in both the Identity Server and the other application? On that note, I really don't understand the Data Protection Keys. Would both the Identity Server and the separate application need to store their keys in the same location (whether that be in the database or the filesystem) in order to be able to decrypt the information stored in the cookie?
Any help is greatly appreciated.
It was definitely the Data Protection Keys that were the issue. My solution was simple: save the encryption key as part of the deployment process, create an IXmlRepository, and then add that to the startup. Easy peasy.
using Microsoft.AspNetCore.DataProtection.Repositories;
using System;
using System.Collections.Generic;
using System.Text;
using System.Xml.Linq;

namespace Myapp.Encryption.Repositories
{
    public class EncryptionRepository : IXmlRepository
    {
        private String Key { get; set; }

        public EncryptionRepository()
        {
            var year = Convert.ToString(DateTime.Now.Year + 2);
            var key = "<key id=\"983440f7-626b-46e4-8bfa-7c3d6d9d4619\" version=\"1\">" +
                      " <creationDate>2019-11-13T17:42:58.889085Z</creationDate>" +
                      " <activationDate>2019-11-13T17:42:58.3843715Z</activationDate>" +
                      " <expirationDate>" + year + "-02-11T17:42:58.3843715Z</expirationDate>" +
                      " <descriptor deserializerType=\"Microsoft.AspNetCore.DataProtection.AuthenticatedEncryption.ConfigurationModel.AuthenticatedEncryptorDescriptorDeserializer, Microsoft.AspNetCore.DataProtection, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60\">" +
                      " <descriptor>" +
                      " <encryption algorithm=\"AES_256_CBC\" />" +
                      " <validation algorithm=\"HMACSHA256\" />" +
                      " <masterKey p4:requiresEncryption=\"true\" xmlns:p4=\"http://schemas.asp.net/2015/03/dataProtection\">" +
                      " <value>{{Your Encryption Key }}</value>" +
                      " </masterKey>" +
                      " </descriptor>" +
                      " </descriptor>" +
                      "</key>";
            Key = key;
        }

        public IReadOnlyCollection<XElement> GetAllElements()
        {
            var collection = new List<XElement>();
            collection.Add(XElement.Parse(Key));
            return collection;
        }

        public void StoreElement(XElement element, String friendlyName)
        {
            // Not required as key is hard coded
        }
    }
}
// In Startup.ConfigureServices:
services.AddSingleton<IXmlRepository, EncryptionRepository>();
services.AddDataProtection().AddKeyManagementOptions(a =>
    a.XmlRepository = services.BuildServiceProvider().GetService<IXmlRepository>());
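For reference, ASP.NET Core's Data Protection stack can also persist and share keys without a hand-rolled IXmlRepository. A minimal sketch, assuming a file share both servers can reach (the share path and application name below are placeholders, not from the original setup):

// Sketch only: both applications must point at the same key store and use the
// same application name so that they can decrypt each other's cookies.
// Requires Microsoft.AspNetCore.DataProtection and System.IO.
services.AddDataProtection()
    .SetApplicationName("MyApp")                                         // placeholder name
    .PersistKeysToFileSystem(new DirectoryInfo(@"\\fileshare\dp-keys")); // placeholder path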

How to connect to Couchbase 2.0+ using lcb_create C API?

I am returning to a C API Couchbase 2.0 project which was working a year ago and now fails to connect to the database, on a computer and with source code that I believe have not changed, except for Ubuntu updates.
I now get the following error when I attempt to call lcb_create:
Failed to create libcouchbase instance: The administrative account can
no longer be used for data access
Have lcb_create parameters or behavior changed? What do I now need to provide?
My code is:
// Create the database instance:
lcb_error_t err;
struct lcb_create_st create_options;
struct lcb_create_io_ops_st io_opts;

io_opts.version = 0;
io_opts.v.v0.type = LCB_IO_OPS_DEFAULT;
io_opts.v.v0.cookie = NULL;

memset(&create_options, 0, sizeof(create_options));
err = lcb_create_io_ops(&create_options.v.v0.io, &io_opts);
if (err != LCB_SUCCESS) {
    fprintf(stderr, "Failed to create libcouchbase IO instance: %s\n",
            lcb_strerror(NULL, err));
    return 1;
}

create_options.v.v0.host = Host;
create_options.v.v0.user = Username;
create_options.v.v0.passwd = Password;
create_options.v.v0.bucket = Bucket;

err = lcb_create(&Instance, &create_options);
if (err != LCB_SUCCESS) {
    fprintf(stderr, "Failed to create libcouchbase instance: %s\n",
            lcb_strerror(NULL, err));
    return 1;
}
The Username I am passing has been the Administrator name, and as I said this used to work. Reading around, it sounds like we now use the bucket name as the user name? Is the bucket field then redundant? And the password is now a bucket password? I don't see a place to set that; maybe I need to update Couchbase Server past 2.0, and the updates have made my API out of sync with the server?
You should use the bucket name and the bucket password to connect to the bucket, not the Administrator credentials. This is covered in the release notes for the Couchbase C client 2.1.1.
The use of Administrator credentials was never documented, was not a good security practice and was rarely used, so Couchbase decided to address the issue.
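Applied to the code in the question, the fix amounts to the following sketch (BucketPassword is a hypothetical variable holding the bucket's password, which may be empty if none was set):

/* Authenticate with the bucket name and the bucket password
   instead of the Administrator account. */
create_options.v.v0.host   = Host;
create_options.v.v0.user   = Bucket;          /* bucket name, not Administrator */
create_options.v.v0.passwd = BucketPassword;  /* bucket password; NULL if none */
create_options.v.v0.bucket = Bucket;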

How to ping my website in Azure?

I wrote a Windows application to ping my website every 5 minutes to check whether it is UP or DOWN at the moment. It was working in our network and on our test server, but our live environment is in Azure, and it doesn't allow me to ping that website.
What can I use instead of Ping to check my website in Azure? And how?
Use HttpWebRequest and HttpWebResponse, and remember to Dispose the objects you create, in particular the response stream of the HttpWebResponse.
var req = (HttpWebRequest)WebRequest.Create(url); // url: the address to check
using (var res = (HttpWebResponse)req.GetResponse())
using (Stream respStream = res.GetResponseStream())
{
    respStream.ReadByte();
}
I would suggest you take it outside of your own infrastructure, mainly because pinging something gives you very limited information anyway. What if the role is up and running but your website is actually crashing with an exception? What if you have multiple instances of your role, one of which is down, and your ping request doesn't notice anything is wrong?
Use something like Pingdom:
http://www.pingdom.com
It'll allow you to do a few things that are probably not possible (or not easy) with your own 'in-house' solution, such as transaction monitoring (similar to very basic UI tests: have a user log in and click around, for instance), multiple ways of alerting (even Twitter alerts) and multiple request locations (it may work from the UK, but does it work when accessed from France?).
Services like this were created for this sole purpose. You need to define when you consider your website to "be up" or "be down". Is it when it responds to pings? Is it when your login page displays OK? Is it when your admin page displays OK?
I used HttpWebRequest and HttpWebResponse instead of Ping, but after 3 calls the program freezes for some reason. I put everything in a try...catch block, but it does not throw any exception either; it only freezes.
try
{
    var myRequest = (HttpWebRequest)WebRequest.Create(url);
    var response = (HttpWebResponse)myRequest.GetResponse(); // After the third time it freezes here
    if (response.StatusCode == HttpStatusCode.OK)
    {
        labelResult.Text += TxtIPAddress.Text + " is Available " + " " + System.DateTime.Now.ToString() + " " + Environment.NewLine;
    }
    else
    {
        labelResult.Text += TxtIPAddress.Text + " is Unavailable " + System.DateTime.Now.ToString() + " " + Environment.NewLine;
    }
}
catch (Exception ex)
{
    labelResult.Text += TxtIPAddress.Text + " is Unavailable " + System.DateTime.Now.ToString() + " " + Environment.NewLine;
}
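A likely cause of the freeze: the HttpWebResponse is never disposed. .NET keeps a small pool of HTTP connections per host (two by default, via ServicePointManager.DefaultConnectionLimit), so once the undisposed responses have exhausted the pool, the next GetResponse() blocks waiting for a free connection instead of throwing. A sketch of the fix, keeping the names from the snippet above:

try
{
    var myRequest = (HttpWebRequest)WebRequest.Create(url);
    // Disposing the response returns its connection to the pool, so
    // subsequent requests no longer block inside GetResponse().
    using (var response = (HttpWebResponse)myRequest.GetResponse())
    {
        if (response.StatusCode == HttpStatusCode.OK)
        {
            labelResult.Text += TxtIPAddress.Text + " is Available " + DateTime.Now + Environment.NewLine;
        }
        else
        {
            labelResult.Text += TxtIPAddress.Text + " is Unavailable " + DateTime.Now + Environment.NewLine;
        }
    }
}
catch (Exception)
{
    labelResult.Text += TxtIPAddress.Text + " is Unavailable " + DateTime.Now + Environment.NewLine;
}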
Another monitoring tool, which is more complex, is PRTG.
It provides a number of monitors and is free up to a certain number of sensors.
This way you can monitor not just a site's existence but also whether a web service returns for a specific call, or a SQL query. The possibilities are almost endless.

Create FTP mount using GIO library

I'm trying to use GIO. I figured out how to use GVolumeMonitor to catch volume changes and get the list of volumes. The g_volume_monitor_get_mounts function gives me a list of existing GMounts. Each of them can represent an HDD partition or a mounted network share (FTP, SMB, SFTP, etc.). Mounting an HDD partition seems to be possible using g_volume_mount. But how do I create a GMount representing a network share? Which classes are responsible for this?
Here is my code:
GVolumeMonitor* monitor = g_volume_monitor_get();
GList* list = g_volume_monitor_get_mounts(monitor);
for (; list; list = list->next) {
    GMount* mount = static_cast<GMount*>(list->data);
    GFile* file = g_mount_get_root(mount);
    qDebug() << "Mount(" << g_mount_get_name(mount)
             << ", " << g_file_get_path(file) << ")";
}
(I know there must be g_object_unref and g_list_free.)
Output:
Mount( SFTP for ri on host.org , /home/ri/.gvfs/SFTP for ri on host.org )
Mount( Yellow hard disk , /media/Yellow hard disk )
I created the first SFTP mount using Nautilus. Now I want to implement this functionality myself. The target OS is Ubuntu 12.04.
I think you might be looking for g_file_mount_enclosing_volume().
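A minimal, untested sketch of mounting an FTP share that way (the host URI is a placeholder; credentials for non-anonymous shares would be supplied through the GMountOperation's "ask-password" signal):

/* Ask the GVfs backend to mount the location the GFile URI points at;
   the result arrives asynchronously in the callback. */
static void on_mounted(GObject* source, GAsyncResult* res, gpointer user_data)
{
    GError* error = NULL;
    if (!g_file_mount_enclosing_volume_finish(G_FILE(source), res, &error)) {
        g_printerr("Mount failed: %s\n", error->message);
        g_error_free(error);
    }
}

// Somewhere with a running GLib main loop:
GFile* location = g_file_new_for_uri("ftp://host.org/"); // placeholder host
GMountOperation* op = g_mount_operation_new();
g_file_mount_enclosing_volume(location, G_MOUNT_MOUNT_NONE, op,
                              NULL /* cancellable */, on_mounted, NULL);

Once the mount succeeds, the new share shows up in g_volume_monitor_get_mounts() just like the Nautilus-created one in your output.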

Creating a database on a BlackBerry phone

Please help me out with this.
I have created a database on a BlackBerry phone, and I have also created a table inside the database, but when clicking on the database in the BlackBerry simulator it shows "UNABLE TO DISPLAY file". The code I have written is:
import net.rim.device.api.database.Database;
import net.rim.device.api.database.DatabaseFactory;
import net.rim.device.api.database.DatabaseSecurityOptions;
import net.rim.device.api.database.Statement;
import net.rim.device.api.io.URI;
import net.rim.device.api.ui.container.MainScreen;

class CreateDatabaseSchemaScreen extends MainScreen {
    Database d;

    public CreateDatabaseSchemaScreen() {
        try {
            URI myURI = URI.create("file:///SDCard/Databases/SQLite_Guide/" + "MyEncryptedDatabase.db");
            DatabaseSecurityOptions dbso = new DatabaseSecurityOptions(true);
            // Note: create() already returns an open Database; the handle is
            // then overwritten by open() without being closed.
            d = DatabaseFactory.create(myURI, dbso);
            d = DatabaseFactory.open(myURI);
            Statement s = d.createStatement("CREATE TABLE 'People' ( " +
                                            "'Name' TEXT, " +
                                            "'Age' INTEGER )");
            s.prepare();
            s.execute();
            s.close();
            d.close();
        }
        catch (Exception e) {
            System.out.println(e.getMessage());
            e.printStackTrace();
        }
    }
}
Database files can't be opened directly; you'll need a third-party desktop tool that allows management of them. You can find a list of these tools here (generally they're free):
http://www.sqlite.org/cvstrac/wiki?p=ManagementTools
In your BB simulator settings, you will see an option that specifies where the SD card data is located. Once you choose a management tool, open the database from that location, and you'll be able to see whether the table has been created successfully.
