I've read questions on Stack Overflow very similar to this question, but not quite the same.
Let's say that I had the following config.inc.php file included on every page of my website:
<?php
$site_name = 'Acme Inc.';
$authenticate_with_ldap = true;
$ldap_host = 'ldap.example.com';
$ldap_port = 389;
$ldap_rdn = 'ldap-user';
$ldap_password = 'ldap-pass';
$ldap_dn = 'ou=example,dc=example,dc=com';
$smtp_username = 'smtp-user';
$smtp_password = 'smtp-pass';
$recaptcha_publickey = 'my-recaptcha-publickey';
$recaptcha_privatekey = 'my-recaptcha-privatekey';
?>
Note: I have chosen to keep the website configuration in a file rather than in the database because the information is used all over the website, and I'm guessing that querying the database for the same values on every page would mean a lot more code and a lot more overhead.
Now let's say that the website administrator is the type of person who would prefer to edit the above information through a CMS rather than editing the file by hand. My fear is that when the administrator clicks the "Update" button and the PHP script reaches the file_put_contents call that overwrites config.inc.php, something could go wrong and either corrupt the file or leave it unusable because of a syntax error.
Is this a reasonable concern? Should I tell the website administrator that he should just tough it out and edit the file manually? Should I store the information in the database instead? Or should I store the information in both places so that if the file gets messed up, it can be regenerated using the information in the database?
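For reference, the regeneration step I have in mind would look roughly like the sketch below. The $_POST keys are placeholders, and writing to a temporary file before renaming it is just one common way to avoid leaving a half-written config.inc.php behind:
<?php
// Sketch only: build the new config file contents from the submitted form.
$settings = array(
    'site_name' => $_POST['site_name'],
    'ldap_host' => $_POST['ldap_host'],
    // ... and so on for the remaining settings ...
);

$code = "<?php\n";
foreach ($settings as $name => $value) {
    // var_export() produces a valid PHP literal, which avoids quoting mistakes.
    $code .= '$' . $name . ' = ' . var_export($value, true) . ";\n";
}

// Write to a temporary file first, then rename it into place; rename() is atomic
// on the same filesystem, so config.inc.php is never left half-written.
$tmp = 'config.inc.php.tmp';
if (file_put_contents($tmp, $code) !== false) {
    rename($tmp, 'config.inc.php');
}
?>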
If you store that info in the DB as a single row of data, wouldn't it be cached anyway?
I am writing a WordPress plugin.
In one program, I capture the WordPress user id and write it into a custom database.
Another program connects to the custom database, retrieves multiple rows having the user id:
$connection = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
$sql = "SELECT ...";
$prep = $connection->prepare($sql);
and tries to call the WordPress function after retrieving each record:
$user_info = get_userdata($user_id);
As soon as the get_userdata function is executed, the program dies.
Do I need to connect to the wordpress database?
If so, how?
First of all, why have you made the database connection manually this way? You can use the global $wpdb object ("global $wpdb;") and write your queries through $wpdb.
Second, you could declare that $user_id variable as global so it can be accessed anywhere in the file, or define it in functions.php and check that it is available.
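If the code runs in a file that WordPress itself loads (a plugin or theme file), a minimal sketch of the $wpdb approach could look like this; the table and column names are placeholders, and it assumes the custom table lives in the same database WordPress uses:
global $wpdb;

// $wpdb->prefix resolves to the site's table prefix (usually "wp_").
$rows = $wpdb->get_results(
    $wpdb->prepare(
        "SELECT * FROM {$wpdb->prefix}my_custom_table WHERE user_id = %d",
        $user_id
    )
);

foreach ($rows as $row) {
    // get_userdata() works here because WordPress is already loaded.
    $user_info = get_userdata($row->user_id);
}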
Please let me know, with details, if either of these suggestions does not work for you.
I'm using the PyroCMS / CodeIgniter combo for a small website.
After adding some content, I checked the DB and saw multiple session rows for the same user.
Is this normal behaviour? Multiple session_ids for one user with the same IP?
I can't imagine that this is correct.
My session config looks like:
$config['sess_cookie_name'] = 'pyrocms' . (ENVIRONMENT !== 'production' ? '_' . ENVIRONMENT : '');
$config['sess_expiration'] = 14400;
$config['sess_expire_on_close'] = true;
$config['sess_encrypt_cookie'] = true;
$config['sess_use_database'] = true;
// don't change anything but the 'ci_sessions' part of this. The MSM depends on the 'default_' prefix
$config['sess_table_name'] = 'default_ci_sessions';
$config['sess_match_ip'] = true;
$config['sess_match_useragent'] = true;
$config['sess_time_to_update'] = 300;
I did not change a single line of code affecting the session class or anything like that.
The rows marked in red belong to a 15-minute cron job; that part is fine, I think.
Every time I refresh the page, two or three new session entries are added...
Yes, this is normal. The CI session class automatically generates a new ID periodically. (Every 5 minutes, by default.) This is part of the security inherent in using CI sessions instead of native PHP sessions. Garbage collection will take care of this, you do not need to do anything.
You can read more about the session id behavior in the CI manual. This is an excerpt copied from that page.
The user's unique Session ID (this is a statistically random string with very strong entropy, hashed with MD5 for portability, and regenerated (by default) every five minutes)
This behavior is by design. There is nothing to fix. The session class has built-in garbage collection that deletes old entries as needed. I have had projects running on CodeIgniter for several years; this is simply what it does.
If it really bothers you, you can alter the timeout in the main CI config file. Change the line
$config['sess_time_to_update'] = 300 (the 5 minute refresh period)
to a number greater than
$config['sess_expiration'] (default 7200)
This will cause the session to time out before it is regenerated. This is inherently less secure in theory, but unless you are transacting sensitive data, it is probably irrelevant in practice.
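For illustration, using the values from your config, the change in the main config file would look something like this (14401 is just an arbitrary number larger than the expiration):
$config['sess_expiration'] = 14400;     // session lifetime in seconds
$config['sess_time_to_update'] = 14401; // greater than sess_expiration, so the ID is never regenerated mid-session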
But again, this is by design as part of the many layers of CI sessions. These and other features are what make it better than PHP native sessions. You can turn on profiling and see that the overhead for these queries is negligible, especially in light of all the other optimizations the framework provides.
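For reference, profiling can be switched on from any controller with a single line (standard CodeIgniter output class call):
$this->output->enable_profiler(TRUE); // appends queries, timings, and memory usage to the rendered page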
Okay, so I'm writing a Google Apps Script for our intranet, and I want to be able to display a list of files from a folder on Google Drive. However, I only want to display files that the user has access to.
There is a method, getViewers, that will return a list of strings:
https://developers.google.com/apps-script/class_file#getViewers
The problem with that is, although it returns email addresses for individual users on the permissions list, it returns only names for groups. This is less than ideal, since there's no way to look up the group object with GroupsManager -- it only accepts the group ID.
There are a few things I could do to work around this. One thing I tried was this:
var files = DocsList.getFolderById('0B_Zfq-SOMETHINGIJUSTMADEUP').getFiles();
for (var f = 0; f < files.length; f++) {
  var viewers = files[f].getViewers();
  var flag = false;
  // userGroups is the list of group objects for this session's user
  for (var i = 0; i < userGroups.length; i++) {
    var groupName = userGroups[i].getName();
    if (viewers.indexOf(groupName) > -1) {
      flag = true;
    }
  }
  if (flag) {
    // print the link to the file within the HTML template
  }
}
But that takes horribly long to load the page, for obvious reasons. It loads in like 5 minutes. What I really need is to be able to get a list of group email addresses from the getViewers method. It seems really strange that it returns emails for individual users, but group names for groups. Does anyone know any solution or workaround for this?
Your best bet will probably be to use a cache that maps group IDs to group names and use that instead of the GroupsManager service; otherwise it will take an age every time the script runs. Depending on how 'live' the docs list needs to be for the intranet site, you could also speed things up by caching the directory map.
If the permissions of the files and the directory listing are very changeable, then the cache could be pre-populated by a helper script running on a time-based trigger to suit your needs.
This is a good suggestion to add on the issue tracker as a Feature Request.
I am using the PowerBuilder PFC library to log in to the database.
n_cst_appmanager/ pfc_open:
IF this.of_LogonDlg() > 0 THEN
    Open(w_myapp_frame)
END IF
n_cst_appmanager/ pfc_logon:
SQLCA.DBMS = "ODBC"
SQLCA.AutoCommit = False
SQLCA.DBParm = "ConnectString='DSN=mytestdb;UID=" + as_userid + ";PWD=" + as_password + "'"
connect using SQLCA;
Now, once the user is logged in, there are a few situations in which I will need to connect to another database (for example, to copy some data there). I would like to connect to the other database automatically, without displaying the login window again, so I need to save the user's username and password.
How can I save them? Do I need to save them in the registry? Can you give an example, please?
For example, I can get the user id in the following way:
s_userid = gnv_app.of_GetUserID()
But I cannot get the password. Can someone please tell me how I can do that? Thanks a lot.
Actually, now that I'm paying attention to what you need instead of what you asked for <g>, and riffing off of Hugh's answer, why not just copy the transaction object?
n_cst_String lnv_String
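// ltr_NewConnect is assumed to be a separate transaction object (e.g. n_tr) declared and created elsewhere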
ltr_NewConnect.DBMS = SQLCA.DBMS
ltr_NewConnect.AutoCommit = SQLCA.AutoCommit
ltr_NewConnect.DBParm = lnv_String.of_GlobalReplace (SQLCA.DBParm, "mytestdb", "myotherdb")
If I were doing this, I'd code a copy of all the transaction object fields, just in case the means of defining the connection changes.
I'm assuming the other database is the same type of database in order for this to make sense (so that it uses the same type of DBParm), but either way the principle may apply.
Good luck,
Terry.
There's nothing built into PFC and there's nothing automagic in PowerBuilder that will help you with this. Just create an instance variable and a function to access it. Maybe grab the n_cst_LogonAttrib from the Message.PowerObjectParm immediately after the call to of_LogonDlg() and grab the value from there. Or, further extend your n_cst_AppManager.pfc_Logon event. Or extend of_LogonDlg(), and model the capture after the way PFC does the user id.
Note that storing the password anywhere permanent and visible to other processes like the registry would be a security violation that many companies would not allow. Not a direction you want to go.
Good luck,
Terry.
You can parse them out of SQLCA.DBParm.
string ls_userID, ls_password
n_cst_string stringSrv
ls_userID = stringSrv.of_getKeyValue(SQLCA.DBParm, "UID", ";")
ls_password = stringSrv.of_getKeyValue(SQLCA.DBParm, "PWD", ";")
However, a good case can be made for capturing them in the appmanager if you know you will need them.
Having the same login credentials for different databases is a security concern. It's the sort of thing that leads to your company being in the news for the wrong reasons.
Greetings,
Well, I am bewildered. I have been tasked with updating a PHP script that uses the Bulk API to upsert some data into the Opportunity entity.
This is all going well except that the Bulk API is returning this error for some clearly defined custom fields:
InvalidBatch : Field name not found : cv__Acknowledged__c
And similar.
I thought I had finally found the problem when I discovered that the WSDL I was using was quite old (Partner WSDL), so I promptly regenerated it. The only problem? Enterprise, Partner, etc. -- all of them -- do not include these fields. They all come from the Common Ground package and start with cv__.
I even tried to find them in the object explorer in Workbench as well as the schema explorer in Force.com IDE.
So, please...lend me your experience. How can I update these values?
Thanks in advance!
Clif
I have screenshots to prove I have the correct access.
EDIT -- Here is my code:
require_once 'soapclient/SforcePartnerClient.php';
require_once 'BulkApiClient.php';
$mySforceConnection = new SforcePartnerClient();
$mySoapClient = $mySforceConnection->createConnection(APP.'plugins'.DS.'salesforce_bulk_api_client'.DS.'vendors'.DS.'soapclient'.DS.'partner.wsdl.xml');
$mylogin = $mySforceConnection->login('redacted#redacted.com', 'redactedSessionredactedPassword');
$myBulkApiConnection = new BulkApiClient($mylogin->serverUrl, $mylogin->sessionId);
$job = new JobInfo();
$job->setObject('Opportunity');
$job->setOpertion('upsert');
$job->setContentType('CSV');
$job->setConcurrencyMode('Parallel');
$job->setExternalIdFieldName('Id');
$job = $myBulkApiConnection->createJob($job);
$batch = $myBulkApiConnection->createBatch($job, $insert);
$myBulkApiConnection->updateJobState($job->getId(), 'Closed');
$times = 1;
// Poll until the batch leaves the queue, backing off a little longer each time.
while($batch->getState() == 'Queued' || $batch->getState() == 'InProgress')
{
    $batch = $myBulkApiConnection->getBatchInfo($job->getId(), $batch->getId());
    sleep(pow(1.5, $times++));
}
$batchResults = $myBulkApiConnection->getBatchResults($job->getId(), $batch->getId());
echo "Number of records processed: " . $batch->getNumberRecordsProcessed() . "\n";
echo "Number of records failed: " . $batch->getNumberRecordsFailed() . "\n";
echo "stateMessage: " . $batch->getStateMessage() . "\n";
if($batch->getNumberRecordsFailed() > 0 || $batch->getNumberRecordsFailed() == $batch->getNumberRecordsProcessed())
{
    echo "Failures detected. Batch results:\n".$batchResults."\nEnd batch.\n";
}
And lastly, an example of the CSV data being sent:
"Id","AccountId","Amount","CampaignId","CloseDate","Name","OwnerId","RecordTypeId","StageName","Type","cv__Acknowledged__c","cv__Payment_Type__c","ER_Acknowledgment_Type__c"
"#N/A","0018000000nH16fAAC","100.00","70180000000nktJ","2010-10-29","Gary Smith $100.00 Single Donation 10/29/2010","00580000001jWnq","01280000000F7c7AAC","Received","Individual Gift","Not Acknowledged","Credit Card","Email"
"#N/A","0018000000nH1JtAAK","30.00","70180000000nktJ","2010-12-20","Lisa Smith $30.00 Single Donation 12/20/2010","00580000001jWnq","01280000000F7c7AAC","Received","Individual Gift","Not Acknowledged","Credit Card","Email"
After 2 weeks, 4 cases, dozens of e-mails and phone calls, 3 bulletin board posts, and 1 Stackoverflow question, I finally got a solution.
The problem was quite simple in the end. (which makes all of that all the more frustrating)
As stated, the custom fields I was trying to update live in the Convio Common Ground package. Apparently our install has 2 licenses for this package. None of the licenses were assigned to my user account.
It isn't clear what is really gained/lost by not having the license other than API access. As the rest of this thread demonstrates, I was able to see and update the fields in every other way.
If you run into this, you can view the licenses on the Manage Packages page in Setup. Drill through to the package in question and it should list the users who are licensed to use it.
Thanks to SimonF's professional and timely assistance on the Developer Force bulletin boards:
http://boards.developerforce.com/t5/Perl-PHP-Python-Ruby-Development/Bulk-API-So-frustrated/m-p/232473/highlight/false#M4713
I really think this is a field level security issue. Is the field included in the opportunity layout for that user profile? Field level security picks the most restrictive option, so if you seem to have access from the setup screen but it's not included in the layout, I don't think the system will give you access.
If you're certain that your user's profile has FLS access to the fields and the assigned layouts include the fields, then I'd suggest looking into the definition of the package in question. I know the bulk API allows use of fields in managed packages normally (I've done this).
My best guess at this point is that your org has installed multiple versions of this package over time. Through component deprecation, it's possible the package author deprecated these custom fields. Take a look at two places once you've logged into Salesforce:
1.) The package definition page. It should have details about what package version was used when the package was first installed and what package version you're at now.
2.) The page that has WSDL generation links. If you choose to generate the enterprise WSDL, you should be taken to a page that has dropdown elements that let you select which package version to use. Try fiddling with those to see if you can get the fields to show up.
These are just guesses. If you find more info, let me know, and I can try to provide additional guidance.