Active Directory Incident Response Automation

I was working on an incident response case in an Active Directory environment and wanted to run a search across all devices in the domain for a certain file, using its MD5 hash, file name, or size. I have a domain controller, so is there any way I can do this with the Windows command line, a GUI tool that can run search queries across multiple computers, or a script I can run against all devices in the domain?
The steps I want to follow are (commands for each would be appreciated):
View all computers in Active Directory (all groups).
Run remote commands to search for a specific PDF file by its hash or name, to check which computers hold the same file.
Access event logs, USB activity, and memory for all users automatically, or suggest any other approach suited to an Active Directory environment.
Any other solution I can use to tackle this would also help.
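A minimal sketch of the per-machine search step, assuming you can already execute code on each computer (e.g. via PowerShell remoting, PsExec, or your EDR tooling); the root path and criteria passed in are up to you. It walks a directory tree and reports files matching a known MD5 hash, name, or size:

```python
import hashlib
import os

def md5_of(path, chunk_size=65536):
    """Compute the MD5 hash of a file without loading it all into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_matches(root, target_md5=None, target_name=None, target_size=None):
    """Yield paths under `root` matching any of the given criteria."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if target_name is not None and name == target_name:
                    yield path
                    continue
                if target_size is not None and os.path.getsize(path) == target_size:
                    yield path
                    continue
                if target_md5 is not None and md5_of(path) == target_md5.lower():
                    yield path
            except OSError:
                # File vanished or is locked; skip it.
                continue
```

Run it on each machine and collect the output centrally; hashing every file is I/O-heavy, so filtering by name or size first is much cheaper than hashing the whole drive.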

Related

GPO or scripting solution to setup site specific Downloads settings in Edge

I need to allow a specific site to automatically download multiple files without the prompt in Edge. While I can do this manually by going to edge://settings/content/automaticDownloads, I need to do this for all users on a bunch of machines in a specific OU.
I couldn't find any GPO for this. I was thinking that if it's possible by scripting, the script could be included in the users' logon scripts. However, I couldn't find any scripting option either.
Any suggestions, pointers, or ideas?
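For what it's worth, Edge (being Chromium-based) does expose a content-settings policy for this, AutomaticDownloadsAllowedForUrls, which can be deployed via GPO or directly in the policy registry hive. Verify it against the current Microsoft Edge policy reference and edge://policy on a test machine first; the site URL below is a placeholder:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Edge\AutomaticDownloadsAllowedForUrls]
"1"="https://example.com"
```

Deployed machine-wide, this applies to all users, so no logon script is needed.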

Is there any database design that works like NTFS, based on RBAC?

If somebody wants to manage permissions in a program, the first idea would be an RBAC implementation. But another idea is a service like NTFS, which allows any user to create and manage their own directories and subdirectories, and moreover to add more users with defined access.
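The NTFS-style model described above is essentially per-object ACLs with an owner who can delegate, rather than global roles. A toy sketch of that model (all names hypothetical) to make the contrast concrete:

```python
class Directory:
    """Toy NTFS-style object: an owner plus per-user permission grants."""

    def __init__(self, name, owner):
        self.name = name
        self.owner = owner
        # The owner starts with full rights, including the right to delegate.
        self.acl = {owner: {"read", "write", "grant"}}
        self.children = []

    def can(self, user, perm):
        return perm in self.acl.get(user, set())

    def grant(self, grantor, user, perms):
        """Anyone holding 'grant' on this object can add users with defined access."""
        if not self.can(grantor, "grant"):
            raise PermissionError(f"{grantor} cannot grant on {self.name}")
        self.acl.setdefault(user, set()).update(perms)

    def create_subdir(self, user, name):
        """Any user with 'write' can create sub-directories, which they then own."""
        if not self.can(user, "write"):
            raise PermissionError(f"{user} cannot write in {self.name}")
        child = Directory(name, owner=user)
        self.children.append(child)
        return child
```

For example, if alice owns a directory and grants bob write access, bob can create a subdirectory that he owns and delegates on, without any admin-defined role existing for him.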

Creating A Log Of Files In A Folder and update into table

Can anyone help me to build a table that lists all files in a specified folder, so whenever a file is copied to that folder the table should update and make a log of files?
I need the list to retain the names, even if the file is moved from that folder or deleted. Later the data would be deleted by a scheduler.
Also I need the table to record the time exactly when the file was copied into that folder and not the modification or creation time.
I am using Windows 7; how can I build a system with my desired behaviour?
Just turn on Windows file auditing for that folder; the YouTube video takes you through the process.
Microsoft provides information on its TechNet site on how to use the LogParser tool to extract Security events from the Event Log database.
Note: Admin questions should really be posted to the SuperUser site.
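If auditing feels heavyweight, a small polling script can approximate the requested behaviour: record each file name with the time it was first seen (the detection time, not the file's own creation or modification time), and keep the row even after the file is moved or deleted. A minimal sketch using SQLite, meant to be run on a timer (e.g. Task Scheduler); paths are placeholders:

```python
import os
import sqlite3
from datetime import datetime

def make_log_db(db_path):
    """Open (or create) the log database with a single file_log table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS file_log ("
        "  name TEXT PRIMARY KEY,"
        "  first_seen TEXT NOT NULL)"
    )
    return conn

def scan_folder(conn, folder):
    """Record any file not yet logged, stamped with the detection time.

    Rows are never deleted here, so names survive later moves or deletions;
    a separate scheduled job can purge old rows as described in the question.
    """
    now = datetime.now().isoformat(timespec="seconds")
    for name in os.listdir(folder):
        if os.path.isfile(os.path.join(folder, name)):
            conn.execute(
                "INSERT OR IGNORE INTO file_log (name, first_seen) VALUES (?, ?)",
                (name, now),
            )
    conn.commit()
```

The trade-off is granularity: a file copied in and removed between two polls is missed entirely, which is exactly where real file auditing wins.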

Informix-SQL (SE) on the cloud with WinTerm thin clients

I have several customers, each running a customized version of my ISQL (SE) desktop app. I would like to replace their desktop apps with WinTerm thin clients connecting to the cloud (my ISQL app on an OpenSUSE or Red Hat server). If this can be done, I would like each customer to use one standardized version of my app to simplify updates and support. However, I'm not sure of the best way to design the database. Should each customer have their own database.dbs (DBPATH=), or is there a better design?
You have two options, one of them that you've already considered:
Each customer has their own database with the standard name.
Each customer has their own database with a separate name for each.
The advantage of option 1 is that your existing code would work substantially unchanged; you'd simply have to ensure that the setting of DBPATH is correct for each customer. The downside is that you need to create a separate directory for each customer too. However, this has its merits; you can probably more easily keep Customer A from seeing any of the files generated by Customer B (and vice versa) if each customer has their own separate home directory, and their database is located in that directory.
The advantage of option 2 is that you can put all the customers' databases in a single directory. As mentioned, this is apt to make it easier for Customer A to see stuff that he should not see that belongs to Customer B. You would also have to ensure that you override the default database name every time you run a command - whether that's sperform or sacego or anything else.
Between the two, I would go with option 1 (separate databases with common name in different directories), with rigid walls between customers. Each customer would have their own user name and group, and the permissions on the directories would avoid public access of any sort. They can all still use a single INFORMIXDIR and your associated forms and reports.
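Under option 1, the per-customer isolation is just a login environment plus directory permissions. A sketch of what each customer's login profile might set; the paths and account name are illustrative, not prescribed:

```shell
# Per-customer login profile (e.g. /home/custA/.profile); names illustrative.
export INFORMIXDIR=/opt/informix          # single shared engine, forms, reports
export PATH="$INFORMIXDIR/bin:$PATH"
export DBPATH=/home/custA                 # this customer's directory holds the .dbs
umask 077                                 # new files private to this customer
```

With the directory itself owned by custA and mode 700, Customer B cannot even list Customer A's files, let alone open the database.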

What is the best deployment approach for WPF applications with local database?

I want to make a WPF application that exists in one directory including all files that it needs: .exe, .mdf database, .xml config files, etc.
The application should work no matter what directory it is in, so that it supports this scenario:
person 1 executes the application in c:\temp\wpftool.exe
the application reads and writes to the c:\temp\wpftool.mdf database
person 1 zips up that directory and sends it to person 2 via e-mail
person 2 unzips it to c:\Users\jim\documents\checkout\wpftool.exe, the application reads and writes to the same database in that directory (c:\Users\jim\documents\checkout\wpftool.mdf)
person 2 zips the directory up again and sends it back to person 1 to continue making changes on it
What is the best way to create a WPF application that supports the above scenario, considering:
there should be no hard-coded database connection strings
what is the best deployment method: ClickOnce, or just copying the .exe out of the /release directory?
reasonable security, so that users have to log in with passwords stored in the database, and so that a third person who intercepts the e-mail could not easily look at the data in the database
Some points on the database side:
Assuming the "new user" already has SQL installed, they'd need to attach the (newly copied) database. Besides having sufficient access rights to attach a database, your application would need to configure the call to include the drive\folder containing the database files. If your .exe can identify its "new home folder" on the fly, you should be able to work that out.
Define "reasonable security". Any database file I get, I can open, review, and ultimately figure out (depends on how obscure the contents are). Can you obfuscate your data, such as using table "A" instead of "Customer"? Would you really want to? The best possible security involves data encryption, and managing that--and in particular, the encryption keys--can be a pretty advanced subject, depending on just how "secure" you want your data to be.
For the database, I would look into using the "user instance" feature in SQL Express. Combined with the |DataDirectory| substitution string support it makes it very easy for your application to get hooked up.
In all honesty I have not deployed a ClickOnce app leveraging this approach myself yet, but I just thought I would bring it to your attention because it's what I would look into myself if I was building something like you described.
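For reference, the |DataDirectory| hookup mentioned above lives in the connection string. A sketch of the relevant app.config fragment, assuming a SQL Server Express instance named .\SQLEXPRESS and the wpftool.mdf file from the question; in a desktop app |DataDirectory| resolves to the .exe's own folder, which is exactly the portability the scenario needs:

```xml
<configuration>
  <connectionStrings>
    <!-- |DataDirectory| resolves to the application's base directory,
         so the .mdf travels with the .exe when the folder is zipped up. -->
    <add name="WpfToolDb"
         connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\wpftool.mdf;Integrated Security=True;User Instance=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
```

With no absolute path in the string, nothing needs to change when the folder moves between person 1 and person 2.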
