I've been looking for a long time for a way to get files out of Basecamp, and so far it seems like a 'mission impossible', but I wanted to ask here as well:
Is there any way to get files from Basecamp projects and, if there is one, how?
Thanks in advance.
Edited: I mean how to get the uploaded files. You can export all project data except the files you have uploaded there.
You can export everything from Basecamp using the following steps.
Log in using Chrome.
Copy the Cookie header from a page document request in Chrome Developer Tools.
wget --mirror -e robots=no --reject logout --no-cookies 'http://your-subdomain.basecamphq.com' --header <pasted-cookie-header>
Matt McClure's answer was spot on, but a couple of things held me up:
To find the required cookie in Chrome Developer Tools, click the Network icon at the top and then the Headers tab.
Copy the entire cookie section from the 'Request Headers' section, including the 'Cookie:' label.
Paste the entire string, in quotes, where Matt indicated <pasted-cookie-header>, as follows:
wget --mirror -e robots=no --reject logout --no-cookies 'http://your-subdomain.basecamphq.com' --header 'Cookie: session_token=c6a1ea88a0187b88025e; transition_token=BAhbB2kDA0VjSXU6CVRpbWUNqB...'
(I've shortened the cookie string for clarity)
The BaseCamp API purports to offer FULL access including files.
Basecamp API documentation http://developer.37signals.com/basecamp/
List of wrappers/libraries http://developer.37signals.com/
If you have any knowledge of REST you will be able to pull any/all data out (modulo rate limiting) manually and then do whatever you like with it.
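As a rough sketch (the subdomain and credentials below are placeholders, not real values), pulling data out of the classic API boils down to plain authenticated HTTP requests, e.g. with curl:

```shell
# Placeholder subdomain and credentials -- substitute your own.
# The Basecamp Classic API serves XML over HTTP Basic auth.
curl -u 'username:password' \
     -H 'Accept: application/xml' \
     'https://your-subdomain.basecamphq.com/projects.xml'
```

Project IDs from that response can then be fed into the per-project endpoints documented in the API reference, keeping an eye on the rate limits.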
Incidentally, if you write a tool that lets me move a project from one account to another, I'll pay good money for that!
Both Matt and Peter already proposed using wget --mirror, which I think is the easiest solution out there. However, I was unable to properly copy the cookies from Google Chrome.
Instead, I went in a slightly different direction and used the Chrome "cookie.txt export" extension to copy all cookies as plain text to a cookies.txt file.
My wget command then looked like this:
wget --mirror -e robots=no --reject logout 'http://yourdomain.basecamphq.com' --load-cookies cookies.txt
Additional note for Mac users: wget can be easily installed with homebrew:
brew install wget
Basecamp lets you export your projects as XML or HTML, and there is also a way to get a PDF. This information can be found in the help/FAQ section of Basecamp: http://basecamphq.com/help/general#exporting_data
More about the PDF export: http://37signals.blogs.com/products/2008/02/export-a-baseca.html
Use this tool for Windows.
In trial mode, this tool allows three projects to be downloaded from Basecamp; after you log in using your Basecamp credentials, it will list all the projects in your account.
Import your Basecamp Classic project to the new Basecamp
Export Data from new Basecamp
Wait
Get email that it's done and download the zip file
You can set up an integration between Basecamp and Dropbox to automatically transfer all your Basecamp attachments into a dedicated Dropbox folder:
http://blog.cloudwork.com/your-automatic-basecamp-dropbox-backup-step-by-step/
The integration is done by CloudWork, which has a free plan, so if you don't expect to back up more than 100 attachments a month it can do this for free. Above that, paid plans apply.
If you have php installed on your machine, save this code to basecampfilevac.php:
<?php
// Make sure the folder the script runs in is writeable (0777).
set_time_limit(0);              // keep the script from timing out
ini_set('memory_limit', '-1');  // remove the memory cap for large downloads

function BasecampCall($endPoint, $usePrefix = true) {
    // From: http://prattski.com/2008/10/22/basecamp-api-examples-using-php-and-curl-get/
    $session = curl_init();
    $basecampId = '[Your Basecamp Account Id Here]'; // a number like 9999999; you can find it in the URL when you log into Basecamp
    $username = '[Your Basecamp Username Here]';
    $password = '[Your Basecamp Password Here]';
    $emailaddress = '[Your Basecamp Email Address Here]';
    $basecampUrl = 'https://basecamp.com/' . $basecampId . '/api/v1/';
    $request = ($usePrefix ? $basecampUrl : "") . $endPoint;
    curl_setopt($session, CURLOPT_URL, $request);
    curl_setopt($session, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
    curl_setopt($session, CURLOPT_HTTPGET, 1);
    curl_setopt($session, CURLOPT_HEADER, false);
    curl_setopt($session, CURLOPT_HTTPHEADER, array('Accept: application/json', 'Content-Type: application/json'));
    curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($session, CURLOPT_USERAGENT, "MyApp (" . $emailaddress . ")");
    curl_setopt($session, CURLOPT_USERPWD, $username . ":" . $password);
    // ereg() is deprecated, and the original tested an undefined variable; check the request URL instead
    if (strpos($request, "https") === 0) curl_setopt($session, CURLOPT_SSL_VERIFYPEER, false);
    $response = curl_exec($session);
    curl_close($session);
    if ($usePrefix) {
        $r = json_decode($response);
    } else {
        $r = $response;
    }
    return $r;
}

$projects = BasecampCall('projects.json');

// For each project take name and id
foreach ($projects as $proj) {
    $pr = array(
        "id" => (string)$proj->id,
        "name" => (string)$proj->name
    );
    // Retrieve the attachments
    echo "\nSaving attachments for project: " . $pr['name'] . "...\n";
    if (!is_dir($pr['name'])) mkdir($pr['name']);
    $filesArray = array();
    $n = 1;
    do {
        $attachments = BasecampCall("projects/" . $proj->id . "/attachments.json?page=" . $n);
        if (count($attachments) > 0) {
            foreach ($attachments as $attachment) {
                $file = pathinfo($attachment->name);
                echo "Saving file " . $attachment->name . "...\n";
                // Append a random suffix when a file name would collide with one already saved
                file_put_contents(
                    $pr['name'] . "/" . $file['filename'] . (in_array($file['filename'], $filesArray) ? "-" . rand() : "") . "." . $file['extension'],
                    BasecampCall($attachment->url, false)
                );
                $filesArray[] = $file['filename'];
            }
        }
        $n++;
    } while (count($attachments) == 50);
}
?>
then update the following lines with the correct information:
$basecampId = '[Your Basecamp Account Id Here]'; //this should be a number like 9999999, You can find it in the URL when you log into Basecamp.
$username = '[Your Basecamp Username Here]';
$password = '[Your Basecamp Password Here]';
$emailaddress = '[Your Basecamp Email Address Here]';
then save and execute this command: php basecampfilevac.php
This is a modified version of a script originally from Rettger Galactic.
My solution for downloading the Basecamp export zip file in bash on the server was quite easy:
Follow the instructions at https://3.basecamp-help.com/article/150-export-your-basecamp-data to start the export.
Wait for Basecamp email with link to the export page.
Open the link to the export download page while having your Browser dev tools open (I'm using Firefox).
Check any GET request in the dev tools and copy that request as a cURL command.
Paste the cURL command into the shell and change the URL to the download link from the 'Download my export' button.
Add the -L flag (follow redirects) as well as --output <your_file.zip> to the cURL command.
Execute the cURL command and download the file ;)
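Put together, the steps above look something like this; the cookie value and export URL are placeholders to be replaced with what you copied from dev tools:

```shell
# Placeholders: paste the real values copied from your browser's dev tools.
COOKIE='Cookie: _session_id=PASTE_VALUE_FROM_DEV_TOOLS'
URL='https://example.basecamp.com/exports/123/download'

# Write the command out so it can be reviewed before running:
# -L follows redirects, --output names the local file.
printf "curl -L -H '%s' --output export.zip '%s'\n" "$COOKIE" "$URL" > download_export.sh
chmod +x download_export.sh
cat download_export.sh
```

Once the placeholders are replaced, running ./download_export.sh fetches the zip.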
I just realised that I didn't back up my Skype app data folder when formatting my PC. There is a certain person's chat history that I am fond of, which I have now lost. I know there are ways to inspect chat logs via database viewers (example: Is there a way to access Skype IM logs?), but that is not what I am after.
Is it possible for the person in question to share our chat history with me, in a way that I can insert it into my Skype app data folder, so that I can restore our history in Skype?
Any help appreciated.
Edit: with my limited database knowledge, I was pondering something like this: sort the other person's main.db file (messages) by contacts, filter out my name, extract those records, and somehow insert them into my own main.db file. But then of course the records would be reversed.
The Skype DB file location depends on the OS and Skype version you use but you can always search for a part of the message that you know must be inside the DB file.
For example, I have Windows 10 and I knew that the DB must contain the string "all that we can use one fresh droplet", so I used Total Commander's ALT+F7 (tick 'find text' below and enter the string). On Windows you can also use fileseek.ca or any other tool; on Linux, grep -rnw '/' -e 'your string here' (for other OSs, Google it).
From here we get a lot of useful information: the default location for older Skype versions was C:\Users\Admin\AppData\Roaming\Skype\yourname, and my latest Skype version's DB file is here: C:\Users\MYNAME\AppData\Local\Packages\Microsoft.SkypeApp_kzf8qxf38zg5c\LocalState\s4l-myusername.db
Then get an SQLite browser, for example sqlitebrowser.org, and open that file. In older Skype you'll need to browse the table "Messages"; in newer Skype, the table "messagesv12". You can export just the chats you need and import them into another DB file. Most of the useful data is probably in the second column, 'nsp_data', in JSON format. At the top of that column's filter area you can type, for example, the Skype username whose chat history you want. With this tool, you can export and also import to another DB file.
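If you prefer the command line, the same export/import round-trip can be sketched with the sqlite3 CLI. The table name messagesv12 and the JSON nsp_data column follow the newer Skype schema, but the rows below are fabricated for illustration; point the commands at your real DB files instead:

```shell
# Build a throwaway source DB mimicking Skype's newer schema (fabricated rows).
sqlite3 source.db "CREATE TABLE messagesv12 (nsp_pk TEXT, nsp_data TEXT);
  INSERT INTO messagesv12 VALUES ('1', '{\"creator\":\"alice\",\"content\":\"hello\"}');
  INSERT INTO messagesv12 VALUES ('2', '{\"creator\":\"bob\",\"content\":\"hi\"}');"

# Copy only the rows mentioning a given username into a second DB file.
sqlite3 source.db <<'EOF'
ATTACH 'filtered.db' AS dest;
CREATE TABLE dest.messagesv12 AS
  SELECT * FROM messagesv12 WHERE nsp_data LIKE '%alice%';
DETACH dest;
EOF

# Verify what landed in the filtered copy.
sqlite3 filtered.db "SELECT COUNT(*) FROM messagesv12;"
```

ATTACH lets one sqlite3 session see both files, so the filter-and-copy is a single SQL statement.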
I personally prefer to get all the data into MySQL so that I have separate columns for date/time, author of the message, conversation partner, and message content. In this answer I got PHP code for the newer Skype chat history, but as I also have some older Skype DB formats, I wrote code to extract the data from those as well:
$db = new SQLite3('db/main.db');
$results = $db->query('SELECT id, chatname, author, from_dispname, timestamp, body_xml FROM Messages');
while ($row = $results->fetchArray()) {
    if (strpos($row[5], $row[1]) !== false) {
        $empty = 1; // In Skype many body XMLs just contain the chat name. Maybe just adding a contact or a call?
    } else {
        $empty = 0;
        $datetime = date("Y-m-d H:i:s", $row[4]); // I need date/time instead of a timestamp
        echo 'ID: ' . $row[0] . ', Chatname: ' . $row[1] . ', Author: ' . $row[2] . ', From displayname: ' . $row[3] . ', Date/time: ' . $datetime . ', Message: ' . $row[5] . '<br>---------<br>';
        //var_dump($row);
    }
}
I've just created a barcode scanner app and have a database of barcodes. I want to upload/sync this database to the company's server so that another programmer can fetch it and build the website UI. Unfortunately, our server is not public (though it can connect to the internet through a proxy), so I want to use Dropbox for this. Could you please give me a useful code sample or tell me the best way to upload/sync the database in this case? I am extremely grateful for your help!
Alright, assuming your database is a MySQL DB with a host environment that lets you run cron jobs, access your FTP, etc., here's a possible code snippet for you. I just had to do this myself with the Dropbox API; you can read the full post for a walk-through (Dropbox API and MySQL DB Dump/Upload).
<?php
# Include the Dropbox SDK libraries
require_once __DIR__."/dropbox-sdk/lib/Dropbox/autoload.php";
use \Dropbox as dbx;
//your access token from the Dropbox App Panel
$accessToken = 'NOT-A-REAL-TOKEN-REPLACE-THIS-QM8jS0z1w1t-REPLACE-THIS-TOKEN';
//run the MySQL dump and zip;
// location of your temp directory
$tmpDir = "your_temp_dir";
// username for MySQL
$user = "DB_user";
// password for MySQL
$password = "DB_password";
// database name to backup
$dbName = "DB_name";
// hostname or IP where database resides
$dbHost = "your_hostname";
// the zip file will have this prefix
$prefix = "sql_db_";
// Create the database backup file
$sqlFile = $tmpDir.$prefix.date('Y_m_d_h:i:s').".sql";
$backupFilename = $prefix.date('Y_m_d_h:i:s').".tgz";
$backupFile = $tmpDir.$backupFilename;
$createBackup = "mysqldump -h ".$dbHost." -u ".$user." --password='".$password."' ".$dbName." > ".$sqlFile;
//echo $createBackup;
$createZip = "tar cvzf $backupFile $sqlFile";
//echo $createZip;
exec($createBackup);
exec($createZip);
//now run the DBox app info and set the client; we are naming the app folder SQL_Backup but CHANGE THAT TO YOUR ACTUAL APP FOLDER NAME;
$appInfo = dbx\AppInfo::loadFromJsonFile(__DIR__."/config.json");
$dbxClient = new dbx\Client($accessToken, "SQL_Backup");
//now the main handling of the zipped file upload;
//this message will send in a system e-mail from your cron job (assuming you set up cron to email you);
echo("Uploading $backupFilename to Dropbox\n");
//this is the actual Dropbox upload method;
$f = fopen($backupFile, "rb");
$result = $dbxClient->uploadFile('/SQL_Backup/'.$backupFilename, dbx\WriteMode::force(), $f);
fclose($f);
// Delete the temporary files
unlink($sqlFile);
unlink($backupFile);
?>
You also need to make a config.json file like so:
{
"key": "YOUR_KEY_FROM_DROPBOX_APP_PANEL",
"secret": "YOUR_SECRET_FROM_DROPBOX_APP_PANEL"
}
You will need to create a new Dropbox app under your Dropbox account to get your key and secret, and to generate the auth code for your username, do that here when logged in: https://www.dropbox.com/developers/apps
You also need to download the Dropbox PHP SDK library to put on your server in the same folder as this PHP code above, find that here: https://www.dropbox.com/developers/core/sdks/php
Hope this helps; if you or your developer need more step-by-step detail, follow that link at the top for a full walk-through.
I am able to upload a document to, and download it from, Google Cloud Storage via a signed URL using HttpClient in Java. But when I put the same signed URL in the browser, I am unable to download the document from the link. I get the following error:
The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method.
But when I tick the 'shared publicly' checkbox in the storage browser, I am able to download from the generated signed URL. However, I want to allow a user to download a document from the browser without marking it as shared publicly.
I want to confirm one confusing point:
For a document to be accessible to a user who does not have a Google account, do I have to tick the 'shared publicly' checkbox in the storage browser even after creating a signed URL?
I would think that if the URL is signed, the 'shared publicly' checkbox should not matter, and a user without a Google account should be able to access the document. But in my case that is not happening. This link:
https://developers.google.com/storage/docs/accesscontrol#About-CanonicalExtensionHeaders
talks about canonicalized extension headers, so I put this in my request header:
request.addHeader("x-goog-acl","public-read");
This is my code
// construct URL
String url = "https://storage.googleapis.com/" + bucket + filename +
"?GoogleAccessId=" + GOOGLE_ACCESS_ID +
"&Expires=" + expiration +
"&Signature=" + URLEncoder.encode(signature, "UTF-8");
System.out.println(url);
HttpClient client = new DefaultHttpClient();
HttpPut request = new HttpPut(url);
request.addHeader("Content-Type", contentType);
request.addHeader("x-goog-acl","public-read");// when i put this i get error
request.addHeader("Authorization","OAuth 1/zVNpoQNsOSxZKqOZgckhpQ");
request.setEntity(new ByteArrayEntity(data));
HttpResponse response = client.execute(request);
When I add request.addHeader("x-goog-acl","public-read"); I get an HTTP/1.1 403 Forbidden error, but when I remove this line the upload succeeds. It seems like I need to set the x-goog-acl header to make the object publicly accessible, yet adding it to my code produces the error. Any suggestions, please?
Finally solved it.
To use a signed URL from the browser, you have to set the HTTP headers correctly. In https://developers.google.com/storage/docs/accesscontrol#Construct-the-String the Content_Type field is described as: "Optional. If you provide this value the client (browser) must provide this HTTP header set to the same value." Note the word "must".
So if you provide a Content_Type when signing the string, you must send the same Content_Type in the browser's HTTP headers. When I set the Content_Type in the browser header, the error was finally solved.
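The underlying reason is that Content_Type is one of the newline-separated fields in the string you sign. Here is a minimal sketch of the v2 string-to-sign layout (all the values below are made up): method, Content-MD5 (empty here), Content-Type, expiry timestamp, and the canonicalized resource:

```shell
METHOD='GET'
CONTENT_TYPE='application/pdf'   # whatever you signed with -- the client must send the same value
EXPIRES='1893456000'             # Unix timestamp
RESOURCE='/mybucket/myfile.pdf'

# Layout: METHOD \n Content-MD5 (empty) \n Content-Type \n Expires \n Resource
printf '%s\n\n%s\n%s\n%s' "$METHOD" "$CONTENT_TYPE" "$EXPIRES" "$RESOURCE" > string_to_sign.txt
cat string_to_sign.txt
```

A request (for example curl -H 'Content-Type: application/pdf' '<your signed url>') must then carry exactly that Content-Type, or GCS recomputes a different signature and rejects the request.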
this works for me:
set_include_path("../src/" . PATH_SEPARATOR . get_include_path());
require_once 'Google/Client.php';

// $duration is in minutes (the default of 3600 gives 60 hours)
function signed_storageURL($filename, $bucket, $p12_certificate_path, $access_id, $method = 'GET', $duration = 3600)
{
    $expires = time() + $duration * 60;
    $content_type = ($method == 'PUT') ? 'application/x-www-form-urlencoded' : '';
    // METHOD \n Content-MD5 (empty) \n Content-Type \n Expires \n Resource
    $to_sign = ($method . "\n" . "\n" . $content_type . "\n" . $expires . "\n" . '/' . $bucket . '/' . $filename);
    $signer = new Google_Signer_P12(file_get_contents($p12_certificate_path), 'notasecret');
    $signature = urlencode(base64_encode($signer->sign($to_sign)));
    return ('https://' . $bucket . '.commondatastorage.googleapis.com/' . $filename . '?GoogleAccessId=' . $access_id . '&Expires=' . $expires . '&Signature=' . $signature);
}

$url = signed_storageURL(rawurlencode("áéíóú espaço & test - =.jpg"), 'mybucket', 'mykey.p12', 'myaccount@developer.gserviceaccount.com');
echo $url;
I want to send some image files via CakePHP mail.
Currently I am using $this->Email->attachments = array($Path.$fileName); for one file only
I want to send multiple files in one email.
It works just as described in the manual; simply add more paths to the array.
http://book.cakephp.org/1.3/view/1638/Attachments
$this->Email->attachments = array(
$Path . $fileName,
$Path . $someOtherFile
);
I have to reset my password directly through the database, so I used this query:
UPDATE users SET pass = md5('NEWPASSWORD') WHERE name = 'admin'
but I am still not able to log in.
Can you please tell me where I am going wrong?
With Drupal 7, passwords are no longer hashed with MD5, which is why your query doesn't work.
There are several ways to reset a password in Drupal 7.
Using drush:
drush upwd admin --password="newpassword"
Without drush, if you have CLI access to the server:
cd <drupal root directory>
php scripts/password-hash.sh 'myPassword'
Now copy the resultant hash and paste it into the query:
update users set name='admin', pass='pasted_big_hash_from_above' where uid=1;
If you are working on a remote environment to which you cannot connect via the command line, you can put this code in a file such as password.php:
<?php
if (isset($_GET['p'])) {
require_once dirname(__FILE__) . '/includes/bootstrap.inc';
require_once dirname(__FILE__) . '/includes/password.inc';
print _password_crypt('sha512', $_GET['p'], _password_generate_salt(DRUPAL_HASH_COUNT));
exit();
}
print "No password to hash.";
And then hit your site using: http://domain.tld/password.php?p=MyPassword. The hash will appear on your browser's tab.
Don't forget to remove it once you're done.
Are you locked out of your account? If you've got DB access then try clearing out the "flood" table.
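For example, assuming a stock Drupal 7 schema with no table prefix (adjust the table name if your site uses one), the throttling records can be cleared like this:

```shell
# Clear login-attempt throttling records (Drupal 7 'flood' table).
# Either with drush from the Drupal root...
drush sql-query "DELETE FROM flood;"
# ...or with plain SQL against the site's database:
# mysql -u dbuser -p drupal_db -e "DELETE FROM flood;"
```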