Connect QuickBooks Online to my PHP website - quickbooks-online

I need to generate invoices on my QuickBooks Online account automatically through my PHP website.
I searched a lot on Intuit and other sources but don't know where to start or what to do.
Please help me.
Thanks

To integrate your PHP site with QuickBooks Online, you want to use qbXML.
To do this, you first need to register with Intuit. It's easiest to register in DESKTOP mode. There are instructions to do this on our QuickBooks PHP wiki. Register as a PRODUCTION application (you can only use DEV/PTC if you get a special account from Intuit, which you won't be able to get).
Once you've registered, you'll have a connection ticket, an app id, and an app login.
Then, you can grab our open source PHP QuickBooks DevKit (use a recent nightly build) and open this example:
docs/example_online_edition.php
From there, you plug in your app id, app login, and connection ticket, and you'll be able to send XML formatted requests to QuickBooks.
You'll want to use Intuit's QuickBooks OSR for XML reference. Make sure to check "OE", uncheck "US", change the qbXML version to 6.0 (QuickBooks Online only supports 6.0), use the "Select Message" drop-down to choose the request type, and the "XML Ops" tab to see the available XML fields.
Your resulting code will look something like:
require_once dirname(__FILE__) . '/../QuickBooks.php';

// Register in DESKTOP mode to get these. Docs:
// http://www.consolibyte.com/docs/index.php/QuickBooks_Online_via_qbXML#Connecting_with_the_.27Desktop.27_model_of_communication
$application_id = '134476443';
$application_login = 'qboe.www.consolibyte.com';
$connection_ticket = 'TGT-68-1sRm2nXMVfm$n8hb2MZfVQ';

// Create our new gateway instance
$Gateway = new QuickBooks_Gateway_OnlineEdition(
    $application_id,
    $application_login,
    $connection_ticket);

$xml = '<QBXMLMsgsRq onError="stopOnError">
    <VendorAddRq>
        <VendorAdd>
            <Name>ConsoliBYTE</Name>
            <FirstName>Keith</FirstName>
            <LastName>Palmer</LastName>
            <VendorAddress>
                <Addr1>123 Test Road</Addr1>
                <City>Mt Pleasant</City>
                <State>MI</State>
                <PostalCode>48858</PostalCode>
            </VendorAddress>
            <Email>support@consolibyte.com</Email>
        </VendorAdd>
    </VendorAddRq>
</QBXMLMsgsRq>';

// Send the request
$resp = $Gateway->qbxml($xml);
print($resp);
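Since your goal is invoices rather than vendors, here is a hedged sketch of what the corresponding InvoiceAdd request might look like. The CustomerRef/ItemRef names below are placeholders, and you should confirm in the OSR exactly which InvoiceAdd fields qbXML 6.0 supports for QuickBooks Online:

$xml = '<QBXMLMsgsRq onError="stopOnError">
    <InvoiceAddRq>
        <InvoiceAdd>
            <CustomerRef>
                <FullName>Example Customer</FullName>
            </CustomerRef>
            <InvoiceLineAdd>
                <ItemRef>
                    <FullName>Example Item</FullName>
                </ItemRef>
                <Desc>Consulting services</Desc>
                <Quantity>2</Quantity>
                <Rate>75.00</Rate>
            </InvoiceLineAdd>
        </InvoiceAdd>
    </InvoiceAddRq>
</QBXMLMsgsRq>';

// Send it through the same $Gateway instance created above
$resp = $Gateway->qbxml($xml);
print($resp);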

Related

Use path/slug after Web App's base url in Google Apps Script

I'm looking to build the URL by adding a path, something like the below, in Google Apps Script:
https://script.google.com/macros/s/APP_ID/exec/fileName.txt
How can I achieve this for the Web App service?
I believe your goal is as follows:
You want to access Web Apps using a URL like https://script.google.com/macros/s/APP_ID/exec/fileName.txt.
For this, how about this answer? I think you can achieve your goal using Web Apps. As a sample case, I would like to explain this using a sample script for downloading a text file when a user accesses https://script.google.com/macros/s/APP_ID/exec/fileName.txt.
Usage:
Please follow the flow below.
1. Create a new Google Apps Script project.
The Web Apps sample below is a Google Apps Script, so please create a Google Apps Script project.
If you want to create it directly, please access https://script.new/. In this case, if you are not logged in to Google, the login screen is opened, so please log in to Google. By this, the script editor of Google Apps Script is opened.
2. Prepare the script.
Please copy and paste the following script (Google Apps Script) into the script editor. This script is for the Web Apps.
function doGet(e) {
  const path = e.pathInfo;
  if (path == "fileName.txt") { // must match the case used in the URL
    const sampleTextData = "sample";
    return ContentService.createTextOutput(sampleTextData).downloadAsFile(path);
  }
  return ContentService.createTextOutput("Wrong path.");
}
In order to retrieve the value of fileName.txt in https://script.google.com/macros/s/APP_ID/exec/fileName.txt, please use pathInfo.
For example, when you check e of doGet(e) by accessing with https://script.google.com/macros/s/APP_ID/exec/fileName.txt, you can retrieve {"contextPath":"","contentLength":-1,"parameter":{},"parameters":{},"queryString":"","pathInfo":"fileName.txt"}.
In this case, the GET method is used.
3. Deploy Web Apps.
On the script editor, open a dialog box by "Publish" -> "Deploy as web app".
Select "Me" for "Execute the app as:".
By this, the script is run as the owner.
Select "Anyone, even anonymous" for "Who has access to the app:".
In this case, no access token is required for requests. I recommend this setting for your goal.
Of course, you can also use the access token. In that case, please set this to "Anyone", and please include the scopes of https://www.googleapis.com/auth/drive.readonly and https://www.googleapis.com/auth/drive in the access token. These scopes are required for accessing the Web Apps.
Click the "Deploy" button as a new "Project version".
A dialog box of "Authorization required" is automatically opened.
Click "Review Permissions".
Select your own account.
Click "Advanced" at "This app isn't verified".
Click "Go to ### project name ###(unsafe)".
Click the "Allow" button.
Click "OK".
Copy the URL of the Web Apps. It's like https://script.google.com/macros/s/###/exec.
4. Run the function using Web Apps.
Please access https://script.google.com/macros/s/###/exec/fileName.txt using your browser. By this, a text file is downloaded.
Note:
When you modify the script of the Web Apps, please redeploy the Web Apps as a new version. By this, the latest script is reflected in the Web Apps. Please be careful about this.
References:
Web Apps
Taking advantage of Web Apps with Google Apps Script
Updated on February 14, 2023
In the current stage, it seems that pathInfo can be used with the access token. Suppose that the following sample script is used.
function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify(e));
}
When you are logged in to your Google account and you access https://script.google.com/macros/s/###/exec/sample.txt with your browser, {"contextPath":"","parameter":{},"pathInfo":"sample.txt","contentLength":-1,"parameters":{},"queryString":""} can be seen.
In this case, when you access it without logging in to your Google account, the login screen is opened even when the Web Apps is deployed as "Execute as: Me" and "Who has access to the app: Anyone". Please be careful about this.
And, if you want to access https://script.google.com/macros/s/###/exec/sample.txt using a script, please request it by including the access token. A sample curl command is as follows. In this case, the access token can be used as a query parameter. Please include one of the scopes of the Drive API in the access token.
curl -L "https://script.google.com/macros/s/###/exec/sample.txt?access_token=###"
By this, the following result is returned.
{"contextPath":"","queryString":"access_token=###"},"pathInfo":"sample.txt","parameters":{"access_token":["###"]},"contentLength":-1}

Email does not send in Laravel API after writing to database

I have an AngularJS 1.5 application which works with a Laravel 5.2 API, and I'm trying to send emails at different points in the application. I'm able to send data to Laravel and it gets recorded in the tables I specify, but when it gets to sending a confirmation email it gives me this error with an HTTP status code of 500: MethodNotAllowedHttpException.
The odd thing is, it works perfectly fine in local development on my laptop. But the same function runs on the AWS EC2 instance and fails when it gets to sending any email. I'm using SendGrid to manage sending emails, but I don't think I need to change any settings for that.
For Example:
$emailUser = array();
$emailUser['email'] = $request->email;
$emailUser['first_name'] = $request->first_name;
$emailUser['last_name'] = $request->last_name;
$emailUser['randomStr'] = str_random(36);
$emailUser['remove_dtm'] = Carbon::now()->addWeeks(2);

// Add a password reset set to 2 weeks out for the user to register
DB::table('password_resets')->insert([
    'email' => $emailUser['email'],
    'token' => $emailUser['randomStr'],
    'remove_dtm' => $emailUser['remove_dtm']
]);

Mail::send('email.registered_user', $emailUser, function ($message) use ($emailUser) {
    $message->to($emailUser['email'], $emailUser['first_name'] . ' ' . $emailUser['last_name']);
    $message->from('WSCUSTOMERPO@waterstoneco.com', 'Waterstone Faucets');
    $message->replyTo('WSCUSTOMERPO@waterstoneco.com', 'Waterstone Faucets');
    $message->subject("Welcome to the Waterstone Faucets Portal!");
});
When I try to reset a user's password it will create the record in the password_resets table but not send the email on the live site. Again, the same function works fine on my laptop. I checked that I'm posting on the Angular side and that the Laravel API is expecting a POST HTTP call when running this function.
What am I missing here?
Thank you greatly for your help!
There are a few things to check here:
1: Are you sure you have your .env file set up with the correct SMTP server settings for SendGrid? If you forgot to set this up in your .env, you will be using the internal mail function. Instead of using SendGrid, I would suggest keeping it inside of Amazon for more reliability; switching over to Amazon SES may be a great option for you.
2: If you are using the internal mail system, there is a really good article about mail from Amazon EC2 instances here: http://shlomoswidler.com/2009/07/sending-email-from-ec2.html
Just a reminder for number 1 for others that may have come here looking for help: to set your mail service in Laravel to use an SMTP provider, open your config/mail.php file and set the driver to use your provider (if provided by Laravel). This can be done by editing the file directly or by setting the environment variable MAIL_DRIVER in your .env file.
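For example, a minimal config/mail.php excerpt for SendGrid over SMTP might look like the sketch below. The host and port are SendGrid's published SMTP settings; the MAIL_USERNAME/MAIL_PASSWORD values are placeholders you would set in .env with your own SendGrid credentials:

<?php
// config/mail.php (excerpt) -- a minimal sketch, assuming SendGrid over SMTP.
// Each env() call falls back to the default shown when the .env entry is missing.
return [
    'driver'     => env('MAIL_DRIVER', 'smtp'),
    'host'       => env('MAIL_HOST', 'smtp.sendgrid.net'),
    'port'       => env('MAIL_PORT', 587),
    'encryption' => env('MAIL_ENCRYPTION', 'tls'),
    'username'   => env('MAIL_USERNAME'), // your SendGrid SMTP username
    'password'   => env('MAIL_PASSWORD'), // your SendGrid SMTP password
];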

google apps from app engine

I want to produce a Google Apps document based on a (Google Doc) template stored on the user's Google Drive and some XML data held by a servlet running on Google App Engine.
Preferably I want to run as much as possible on GAE. Is it possible to run Apps Service APIs on GAE or download/manipulate a Google Doc on GAE? I have not been able to find anything suitable.
One alternative is obviously to implement the merge functionality in an Apps Script, transferring the XML as parameters and initiating the script through HTTP from GAE, but that just seems somewhat awkward in comparison.
EDIT:
Specifically, I am looking for the replaceText script functionality, as shown in the Apps Script snippet below, to be implemented on GAE. The remaining code is supported through the Drive/Mail APIs, I guess.
// Get document template, copy it as a new temp doc, and save the Doc's id
var copyId = DocsList.getFileById(providedTemplateId)
    .makeCopy('My-title')
    .getId();
var copyDoc = DocumentApp.openById(copyId);
var copyBody = copyDoc.getActiveSection();

// Replace placeholder keys
copyBody.replaceText("CustomerAddressee", fullName);
var todaysDate = Utilities.formatDate(new Date(), "GMT+2", "dd/MM-yyyy");
copyBody.replaceText("DateToday", todaysDate);

// Save and close the temporary document
copyDoc.saveAndClose();

// Convert temporary document to PDF by using the getAs blob conversion
var pdf = DocsList.getFileById(copyId).getAs("application/pdf");

// Attach PDF and send the email
MailApp.sendEmail({
  to: email_address,
  subject: "Proposal",
  htmlBody: "Hi,<br><br>Here is my file :)<br>Enjoy!<br><br>Regards Tony",
  attachments: pdf
});
As you already found out, Apps Script is currently the only way to access an API that modifies Google Docs. All other approaches cannot do it unless you export to another format (like PDF or .doc), use libraries that can modify those, then re-upload the new file asking to convert it to a Google Docs native format, which in some cases would lose some formatting/comments/named ranges and other Google Docs features. So, like you said, if you must use the Google Docs API you must call Apps Script (as a content service). Also note that the sample Apps Script code you show is old and uses the deprecated DocsList, so you need to port it to the Drive API (for example, DriveApp.getFileById(...).makeCopy(...) in current Apps Script).
Apps Script pretty much piggybacks on top of the standard published Google APIs, and increasingly the behaviours are becoming more familiar.
Obviously Apps Script is JS-based and GAE is not. All the APIs, apart from those related to script running, are available in the standard GAE client runtimes.
No code to check here, so I'm afraid a generic answer is all I have.
I see now it can be solved by using the Google Drive API to export (download) the Google Apps Doc file as a PDF (or other formats) to GAE, and doing simple replace-text editing using e.g. the iText library.

How to use Bedework server as a service for another system

For my application I need to use an open source calendar server. After some research I selected the Bedework server for my task. Basically, what I want is to use this server to handle my application's calendar events. Even though I have set up a local server using the quickstart package, I am still kind of confused about how I can use this. I can create events using its web UI, but I want to use it as a service from my server (something like a REST service). I read their documentation but could not find anything that helps. I would be really grateful if you could help me with this. Thanks in advance.
You can access the server using the CalDAV protocol. This is a standard REST protocol which specifies how you create/query/delete events and todos. It is the same protocol the Calendar or Reminders apps on OS X and iOS use to talk to the server.
The CalConnect CalDAV website is a good entry point to learn more about this.
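To make "CalDAV as REST" concrete, here is a minimal sketch of a calendar-query REPORT issued with PHP's curl extension. It assumes the Bedework quickstart's default localhost:8080 endpoint and the vbede/bedework demo account (both placeholders to adjust for your installation); the request body follows the calendar-query format from RFC 4791:

<?php
// Fetch all VEVENTs from a Bedework user calendar via a CalDAV REPORT.
$body = '<?xml version="1.0" encoding="utf-8"?>
<C:calendar-query xmlns:D="DAV:" xmlns:C="urn:ietf:params:xml:ns:caldav">
  <D:prop><C:calendar-data/></D:prop>
  <C:filter>
    <C:comp-filter name="VCALENDAR">
      <C:comp-filter name="VEVENT"/>
    </C:comp-filter>
  </C:filter>
</C:calendar-query>';

$ch = curl_init('http://localhost:8080/ucaldav/user/vbede/calendar/');
curl_setopt_array($ch, array(
    CURLOPT_CUSTOMREQUEST  => 'REPORT',          // CalDAV query verb
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_USERPWD        => 'vbede:bedework',  // quickstart demo credentials
    CURLOPT_HTTPHEADER     => array('Depth: 1', 'Content-Type: application/xml'),
    CURLOPT_POSTFIELDS     => $body,
));
// The response is a WebDAV multistatus document containing matching events as iCalendar data.
print(curl_exec($ch));
curl_close($ch);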
If you are still looking into this, you can try using any of the CalDAV client libraries:
CalDAV-libraries
I tried the CalDAV4j library. For all basic use cases, it works fine.
There is also a demo GitHub project on this library, developed to list the events on the server:
list-events-caldav4j-example
You can make use of ListCalendarTest.java in the project and give appropriate endpoints to the host configuration. For example (for Bedework):
HttpClient httpClient = new HttpClient();
// I tried it with Zimbra - but I had no luck using Google Calendar
httpClient.getHostConfiguration().setHost("localhost", 8080, "http");
String username = "vbede";
UsernamePasswordCredentials httpCredentials = new UsernamePasswordCredentials(username, "bedework");
...
CalDAVCollection collection = new CalDAVCollection("/ucaldav/user/" + username + "/calendar",
        (HostConfiguration) httpClient.getHostConfiguration().clone(), new CalDAV4JMethodFactory(),
        CalDAVConstants.PROC_ID_DEFAULT);
...
GenerateQuery gq = new GenerateQuery();
// TODO you might want to adjust the date
gq.setFilter("VEVENT [20131001T000000Z;20131010T000000Z] : STATUS!=CANCELLED");
CalendarQuery calendarQuery = gq.generate();

How to automate download of weekly export service files

In Salesforce you can schedule up to weekly "backups"/dumps of your data here: Setup > Administration Setup > Data Management > Data Export.
If you have a large Salesforce database, there can be a significant number of files to download by hand.
Does anyone have a best practice, tool, batch file, or trick to automate this process or make it a little less manual?
Last time I checked, there was no way to access the backup file status (or actual files) over the API. I suspect they have made this process difficult to automate by design.
I use the Salesforce scheduler to prepare the files on a weekly basis, then I have a scheduled task that runs on a local server which downloads the files. Assuming you have the ability to automate/script some web requests, here are some steps you can use to download the files:
1. Get an active Salesforce session ID/token
   - enterprise API - login() SOAP method
2. Get your organization ID ("org ID")
   - Setup > Company Profile > Company Information, OR
   - use the enterprise API getUserInfo() SOAP call to retrieve your org ID
3. Send an HTTP GET request to https://{your sf.com instance}.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport
   - Set the request cookie as follows: oid={your org ID}; sid={your session ID};
4. Parse the resulting HTML for instances of <a href="/servlet/servlet.OrgExport?fileName=
   (the filename begins after fileName=)
5. Plug the file names into this URL to download (and save): https://{your sf.com instance}.salesforce.com/servlet/servlet.OrgExport?fileName={filename}
   - Use the same cookie as in step 3 when downloading the files
This is by no means a best practice, but it gets the job done. It should go without saying that if they change the layout of the page in question, this probably won't work any more. Hope this helps.
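If you want a starting point for scripting steps 3-5, here is a rough PHP sketch. The instance name, org ID, and session ID are placeholders you would obtain from the login()/getUserInfo() calls in steps 1-2, and the HTML scraping will break if Salesforce changes the page layout:

<?php
// Download weekly export files by scraping the Data Export page (steps 3-5 above).
$instance  = 'na1';             // placeholder: your sf.com instance
$orgId     = '00Dxxxxxxxxxxxx'; // placeholder: from getUserInfo()
$sessionId = '...';             // placeholder: from login()

$cookie = "oid={$orgId}; sid={$sessionId}";
$base   = "https://{$instance}.salesforce.com";

// Step 3: fetch the Data Export page, sending the oid/sid cookie
$ch = curl_init("{$base}/ui/setup/export/DataExportPage/d?setupid=DataManagementExport");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIE, $cookie);
$html = curl_exec($ch);
curl_close($ch);

// Step 4: pull out the servlet.OrgExport links
preg_match_all('#/servlet/servlet\.OrgExport\?fileName=[^"]+#', $html, $matches);

// Step 5: download each file with the same cookie
foreach ($matches[0] as $i => $path) {
    $ch = curl_init($base . html_entity_decode($path));
    curl_setopt($ch, CURLOPT_COOKIE, $cookie);
    $fp = fopen("export_{$i}.zip", 'wb');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}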
A script to download the Salesforce backup files is available at https://github.com/carojkov/salesforce-export-downloader/
It's written in Ruby and can be run on any platform. The supplied configuration file provides fields for your username, password, and download location.
With little configuration you can get your downloads going. The script sends email notifications on completion or failure.
It's simple enough to figure out the sequence of steps needed to write your own program if the Ruby solution does not work for you.
I'm Naomi, CMO and co-founder of cloudHQ, so I feel like this is a question I should probably answer. :-)
cloudHQ is a SaaS service that syncs your cloud. In your case, you'd never need to download your reports as a data export from Salesforce; you'd just always have them backed up in a folder labeled "Salesforce Reports" in whichever service you synchronized Salesforce with, like Dropbox, Google Drive, Box, Egnyte, SharePoint, etc.
The service is not free, but there's a free 15-day trial. To date, there's no other service that actually syncs your Salesforce reports with other cloud storage companies in real time.
Here's where you can try it out: https://cloudhq.net/salesforce
I hope this helps you!
Cheers,
Naomi
Be careful that you know what you're getting in the backup file. The backup is a zip of 65 different CSV files. It's raw data that cannot be used very easily outside of the Salesforce UI.
Our company makes the free DataExportConsole command-line tool to fully automate the process. You do the following:
1. Automate the weekly Data Export with the Salesforce scheduler
2. Use the Windows Task Scheduler to run the FuseIT.SFDC.DataExportConsole.exe file with the right parameters.
I recently wrote a small PHP utility that uses the Bulk API to download a copy of the sObjects you define via a JSON config file.
It's pretty basic but can easily be expanded to suit your needs.
Force.com Replicator on GitHub.
Adding a Python 3.6 solution. It should work (I haven't tested it, though). Make sure the packages (requests, beautifulsoup4 and simple_salesforce) are installed.
import os
import requests
from datetime import datetime
from bs4 import BeautifulSoup as BS
from simple_salesforce import Salesforce


def login_to_salesforce():
    sf = Salesforce(
        username=os.environ.get('SALESFORCE_USERNAME'),
        password=os.environ.get('SALESFORCE_PASSWORD'),
        security_token=os.environ.get('SALESFORCE_SECURITY_TOKEN')
    )
    return sf


org_id = "SALESFORCE_ORG_ID"  # can be found in Salesforce -> Company Profile
export_page_url = "https://XXXX.my.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport"

sf = login_to_salesforce()
cookie = {'oid': org_id, 'sid': sf.session_id}
export_page = requests.get(export_page_url, cookies=cookie)
export_page = export_page.content.decode()

links = []
parsed_page = BS(export_page, "html.parser")
_path_to_exports = "/servlet/servlet.OrgExport?fileName="
for link in parsed_page.findAll('a'):
    href = link.get('href')
    if href is not None and href.startswith(_path_to_exports):
        links.append(href)

print(links)
if len(links) == 0:
    print("No export files found")
    exit(0)

today = datetime.today().strftime("%Y_%m_%d")
download_location = os.path.join(".", "tmp", today)
os.makedirs(download_location, exist_ok=True)

baseurl = "https://XXXX.my.salesforce.com"  # same instance as export_page_url

for link in links:
    url = baseurl + link
    # stream=True keeps RAM consumption low for large export files
    downloadfile = requests.get(url, cookies=cookie, stream=True)
    filename = downloadfile.headers['Content-Disposition'].split("filename=")[1]
    with open(os.path.join(download_location, filename), 'wb') as f:
        for chunk in downloadfile.iter_content(chunk_size=100 * 1024 * 1024):  # 100 MB chunks
            if chunk:
                f.write(chunk)
I have added a feature in my app to automatically back up the weekly/monthly CSV files to an S3 bucket: https://app.salesforce-compare.com/
Create a connection provider (currently only AWS S3 is supported) and link it to an SF connection (which needs to be created as well).
On the main page you can monitor the progress of the scheduled job and access the files in the bucket.
More info: https://salesforce-compare.com/release-notes/
