Deleting and adding users in a custom role, DNN portal - dotnetnuke

I am on DNN 4.7.0. I would like to delete all users who are in one of my custom roles and add new ones. Is there a way I can do this without deleting and adding users one by one? Are these users saved somewhere in the database where I could just run delete and insert statements?
Your help will be highly appreciated.

The user/role relationship is stored in the UserRoles table. You could write SQL against that table to insert the new relationships and remove the old ones.
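For example, a rough sketch of the SQL involved, assuming the default UserRoles and Roles table names with no object qualifier prefix and a role called 'MyCustomRole' (both are placeholders; back up first, note that your schema may require additional columns, and remember that edits made this way bypass DNN's API and its cached role information):
-- Remove every user from the custom role
DELETE FROM UserRoles
WHERE RoleID = (SELECT RoleID FROM Roles WHERE RoleName = 'MyCustomRole');
-- Add an existing user (UserID 123 is a placeholder) to the role
INSERT INTO UserRoles (UserID, RoleID)
SELECT 123, RoleID FROM Roles WHERE RoleName = 'MyCustomRole';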
On a side note: I would highly encourage you to check out a newer version of DNN. 4.7 is rather dated, buggy, and has known security issues.

Related

Create migration file while creating table in Laravel Voyager

I am using the Voyager admin package for Laravel. Whenever I create a table from the Database Manager it creates the table, but it does not create a migration file. That is a poor fit for a team working through a git repository (where migrations are how you share the application's database schema): everyone in the group has to create the tables in the backend themselves before they can work, which is not good.
It does create the table in the database (visible in phpMyAdmin),
and there is an option to create a Model while creating the table.
Any solution? I need a quick response.
Unfortunately, Laravel Voyager doesn't generate migrations for user-created tables.
There are two workarounds.
Laravel Migrations Generator
Use this dev package to generate the migrations for the given tables. Available on GitHub at: https://github.com/Xethron/migrations-generator. See its documentation for how to generate the migrations for specific tables.
However, the collaborators will have to create the BREADs for them.
Copying the database
If you share the database itself, the Laravel Voyager configuration tables that hold all the changes and specs will be available to every collaborator.
Laravel Voyager saves all of its configuration in tables, which removes the need to generate migrations. Porting the whole database works for most of my projects, since I work on most apps alone.

Dacpac must not drop extra columns

I have happily been writing a product which uses a SQL Server Database Project, and life has been good until we discovered a problem in upgrades.
While we create tables, stored procedures and various other database artefacts, once deployed at customer sites the customers can add their own columns to the tables created by our dacpac.
We are using DacFx for deployment (Microsoft.SqlServer.Dac) and also provide the raw dacpac for customers who insist on deployment by their DBAs.
While the problem may still be present when using SSMS or similar tools, I am certain that with the "right" code we should somehow be able to prevent this when deploying via code.
Has anyone had the same issues and possibly found a solution?
Update: added a screenshot of the deployment settings.
As can be seen in the image, the "Drop objects in target but not in project" setting is already turned off.
Love this statement: "I have happily been writing a product which uses a SQL Server Database Project and life has been good" ha ha!
You could write a deployment contributor that looks for new columns and removes the drop steps from the deployment plan.
You can either write your own, or I have one that should do it (http://agilesqlclub.codeplex.com/). If you use mine, then this will probably work for you:
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=KeepType(.*Column.*)"
If you want to write your own then you can use mine as a guide (the source is on CodePlex), or see http://blogs.msdn.com/b/ssdt/archive/2013/12/23/dacfx-public-model-tutorial.asp, specifically "Solution 2: Filtering at deployment time".
Ed
There is an option, "DropObjectsNotInSource"; if that is false then the columns will stay, but you will have to drop any objects you do want removed explicitly (say, in a post-deployment script, as sketched below).
There are more options in the latest build, but I don't think you can specify keeping only columns.
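To illustrate that post-deploy suggestion: with "DropObjectsNotInSource" set to false, anything you genuinely want removed can be dropped explicitly from a post-deployment script, along these lines (dbo.ObsoleteProc is an invented name):
-- Post-deployment script: drop only the objects we explicitly want gone
IF OBJECT_ID('dbo.ObsoleteProc', 'P') IS NOT NULL
    DROP PROCEDURE dbo.ObsoleteProc;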

Is there a way to create a constraint that prevents updates in a SQL Server 2012 database table?

I am creating an application that will track hours for employees. Ideally, HR has asked that certain tables not be modified once data is committed. This is done easily enough from the front-end and stored procedures. However, it would be great to be able to prevent it on the server itself through constraints, so that folks who have access to the back-end data can't change any values in the selected tables (unless they are sneaky enough to know how to disable the constraints).
If you trust your SQL Server admins, then it's possible. Have your admin create users that don't have data-writer permissions on those tables or that schema.
That way the application would write the data into the database, and users who have access to those tables would only be able to read it.
If you don't want admins to have the ability to modify data, that's not possible. There is no way to prevent it, but there is a way to detect it if it happens. Check out this article for details on how this is done in a third-party application and see if it helps.
Use server-side security roles to give data-write privileges only to the HR group.
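A minimal sketch of the permissions approach described in these answers, assuming the hours are stored in a table called dbo.EmployeeHours, back-end users are members of a role called HR_Readers, and the application connects as AppServiceUser (all three names are invented):
-- Back-end users can only read the protected table
CREATE ROLE HR_Readers;
GRANT SELECT ON dbo.EmployeeHours TO HR_Readers;
DENY UPDATE, DELETE ON dbo.EmployeeHours TO HR_Readers;
-- Only the application's own login can write
GRANT SELECT, INSERT ON dbo.EmployeeHours TO AppServiceUser;
As noted above, this only restrains users covered by those roles; a sysadmin can still bypass or change it.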

How to configure database permissions for a Django app?

I'm looking for links, or an answer here, on how to properly configure database permissions to secure a Django app. To be clear, I'm looking specifically for material dealing with grants on the database, not permissions within the Django framework itself.
From the django docs:
https://docs.djangoproject.com/en/dev/topics/install/
If you plan to use Django’s manage.py syncdb command to automatically create database tables for your models (after first installing Django and creating a project), you’ll need to ensure that Django has permission to create and alter tables in the database you’re using; if you plan to manually create the tables, you can simply grant Django SELECT, INSERT, UPDATE and DELETE permissions. On some databases, Django will need ALTER TABLE privileges during syncdb but won’t issue ALTER TABLE statements on a table once syncdb has created it. After creating a database user with these permissions, you’ll specify the details in your project’s settings file, see DATABASES for details.
I've just tested the initial setup with MySQL. For python manage.py migrate you need at least the following grants for simple operation (if you let Django prepare the database); a MySQL sketch follows the list:
CREATE, ALTER, INDEX
SELECT, UPDATE, INSERT, DELETE
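For example, on MySQL that could look something like this (my_db, my_user and the password are placeholders; this uses the MySQL 5.x style where GRANT can also create the account):
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, ALTER, INDEX
    ON my_db.* TO 'my_user'@'localhost' IDENTIFIED BY 'my_user_pass';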
And, by the way, security matters. You can reduce the impact of an attack by limiting your system's exposure. In this case you can withhold DROP, which is a fairly big plus: if you leave some tricky hole open to SQL injection, you have probably reduced the damage. I will research in the future whether removing the DELETE privilege would do any harm; that would limit potential threats as well. Just because we all leave bugs from time to time :)
I usually:
grant all privileges on my_db.* to 'my_user'@'localhost' identified by 'my_user_pass';
grant all privileges on test_my_db.* to 'my_user'@'localhost' identified by 'my_user_pass';
I suppose if there were a bug in django, you might be opening your database up to terrible things, but you'd have other problems if there were that big of a security hole in django.
Django minimally needs SELECT, INSERT, UPDATE, and DELETE to operate. If you're using test or syncdb at all, you'll also need to be able to create tables and indexes (and maybe the FILE privilege for loading SQL fixtures).
So, for a MySQL database, I'd guess the optimal set of permissions might be SELECT, INSERT, UPDATE, DELETE, CREATE, INDEX, and FILE. If you wanted to get really nitty-gritty, you could selectively grant these permissions as appropriate at the table level (rather than the database level).
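For instance, a table-level variant might look like this (my_db and the table names are placeholders for whichever tables your app actually uses):
-- Day-to-day privileges granted per table rather than on the whole schema
GRANT SELECT, INSERT, UPDATE, DELETE ON my_db.myapp_mymodel TO 'my_user'@'localhost';
GRANT SELECT, INSERT, UPDATE, DELETE ON my_db.auth_user TO 'my_user'@'localhost';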
Personally, I find grant all ... easier to type.
What's the purpose of configuring permissions at the database level? If your server is compromised, the attacker will be able to do anything with your database (because they have the login/password), and permissions won't help. If your server is secure, then the permissions are useless.
Permissions can make sense if your DB server is reachable from the outside world, but exposing it like that is not a good idea in the first place.

Database sharing/versioning

I have a question but I'm not sure of the word to use.
My problem: I have an application that uses a database to store information. The database can be in Access (local) or on a server (SQL Server or Oracle); we support these three kinds of databases. We want to give the user the possibility to do what I think we can call versioning.
Let me explain: we have database 1, which is the master. We want to be able to create a database 2 that is the same thing as database 1, but which we can give to someone else.
Each party then works on its own side, adding, modifying and deleting records in this very complex database. After that, we want database 1 to include the changes from database 2, but with the possibility to dismiss some of the changes.
For your information, our application is already multi-user, so why don't we just use that and forget about this versioning? Because sometimes we need to give a copy of the database to another company at another site, and they cannot connect to our server. They work on their side and then we want to merge.
Is there anyone here with experience with this type of requirement? We have a lot of ideas, but most of them require a LOT of work: massive modifications to the database or to the existing queries.
This is a 2-million-line (and growing) C++ app, so rewriting it is not possible!
Thanks for any ideas that you may give us!
J-F
The term you are looking for is Database Replication. You can google that to get more information about the topic (my personal experience is limited).
This was already done by ical (an old SunOS calendar app).
What you store/remember/transmit when the app makes changes is not just the database contents, but the actual change log (e.g. "delete record with ID 1", "update record with ID 2 with these fields", "insert record with these fields").
That way you can apply these changes to the master DB later on, AND filter them before applying.
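A rough sketch of what such a change log could look like as a table (all names are invented, the exact column types would differ between Access, SQL Server and Oracle, and the payload could be XML, JSON or any format the app already understands):
-- One row per change made while a copy of the database is disconnected
CREATE TABLE ChangeLog (
    ChangeID   INT           NOT NULL PRIMARY KEY,  -- sequence/identity per DBMS
    ChangedAt  DATETIME      NOT NULL,
    TableName  VARCHAR(128)  NOT NULL,
    Operation  VARCHAR(10)   NOT NULL,              -- 'INSERT', 'UPDATE' or 'DELETE'
    RecordKey  VARCHAR(128)  NOT NULL,              -- key of the affected record
    Payload    VARCHAR(4000) NULL                   -- new field values, if any
);
When merging, database 1 replays database 2's log in order, and individual rows can be skipped to dismiss specific changes.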
