Frappe ERPNext and quickbooks connector - quickbooks-online

I am a Python developer. I need to build a connector that will sync customers between ERPNext and QuickBooks (using the QuickBooks API). How can I achieve that? Please point me to any SDKs or docs available for this kind of ERPNext customization.

I can recommend the options below:
1) Frappe's built-in REST API functionality is the easiest and most reliable way to integrate with other systems. You can prepare a script or application that transfers customer data between the two systems via their APIs.
For full-duplex sync:
Step 1: Retrieve new (or updated) ERPNext customers via its API.
Step 2: Post those customers to QuickBooks via its API.
Step 3: Retrieve new (or updated) QuickBooks customers via its API.
Step 4: Post those customers to ERPNext via its API.
You can run that script periodically from the OS scheduler (e.g. cron).
2) Alternatively, you can create a replication task if your QuickBooks implementation uses a MariaDB or MySQL database.
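The four-step loop in option 1 could be sketched in Python. This is a minimal sketch, not a full implementation: the instance URL and credentials are placeholders, the `/api/resource/Customer` path is ERPNext's standard REST resource route, and the QuickBooks field names (`DisplayName`, `PrimaryEmailAddr`) follow its Customer entity.

```python
import json
import urllib.parse
import urllib.request

ERPNEXT_URL = "https://erp.example.com"   # placeholder instance URL
ERPNEXT_TOKEN = "api_key:api_secret"      # placeholder credentials

def map_erpnext_to_quickbooks(customer):
    """Translate an ERPNext Customer doc into a QuickBooks Customer payload."""
    payload = {"DisplayName": customer["customer_name"]}
    if customer.get("email_id"):
        payload["PrimaryEmailAddr"] = {"Address": customer["email_id"]}
    return payload

def fetch_new_erpnext_customers(modified_since):
    """Step 1: pull customers created or updated since the last run."""
    params = urllib.parse.urlencode({
        "filters": json.dumps([["modified", ">", modified_since]]),
        "fields": json.dumps(["name", "customer_name", "email_id"]),
    })
    req = urllib.request.Request(
        f"{ERPNEXT_URL}/api/resource/Customer?{params}",
        headers={"Authorization": f"token {ERPNEXT_TOKEN}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["data"]
```

A real sync would also need to post the mapped payloads to the QuickBooks Customer endpoint with OAuth2 credentials, do the reverse direction (steps 3 and 4), and persist the last-run timestamp between invocations.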

Related

Can I connect with Redshift from Angular for reporting?

I need to connect Angular with Redshift for historical reporting. Can it be achieved, and what are the prerequisites?
This is possible in theory using the Redshift Data API, but you should consider whether you truly want client machines writing and executing SQL directly against Redshift.
If you allow this, note the following:
The client machine sends the SQL to be executed; a malicious actor could modify it, so permissions become very important.
You would need to generate IAM credentials via a service like Cognito to directly interact with the API.
It would be more appropriate to create an API that communicates with Redshift directly, offering protection over which SQL can be executed.
This could use API Gateway and Lambda to keep it simple, with your frontend calling this API instead of writing the SQL itself.
More information is available in the Announcing Data API for Amazon Redshift post.
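As a sketch of that recommendation, a Lambda handler behind API Gateway could whitelist report queries and run them through the Data API with boto3's `redshift-data` client. The cluster identifier, database, user, and report names below are assumptions for illustration:

```python
import json

# Whitelist: the frontend sends a report name, never raw SQL.
ALLOWED_REPORTS = {
    "daily_signups": "SELECT signup_date, COUNT(*) FROM users GROUP BY 1",
}

def build_statement(report_name):
    """Map a report name from the frontend to a vetted SQL statement."""
    if report_name not in ALLOWED_REPORTS:
        raise ValueError(f"unknown report: {report_name}")
    return {
        "ClusterIdentifier": "my-cluster",   # assumed cluster name
        "Database": "analytics",             # assumed database
        "DbUser": "report_reader",           # assumed read-only user
        "Sql": ALLOWED_REPORTS[report_name],
    }

def lambda_handler(event, context):
    """API Gateway proxy handler: run a whitelisted query via the Data API."""
    import boto3  # available in the Lambda runtime
    client = boto3.client("redshift-data")
    params = build_statement(json.loads(event["body"])["report"])
    result = client.execute_statement(**params)
    return {"statusCode": 200, "body": json.dumps({"id": result["Id"]})}
```

The Data API is asynchronous, so the frontend (or a second endpoint) would poll `describe_statement`/`get_statement_result` with the returned statement `Id` to retrieve the rows.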

Snowflake → Zapier Integration

I'm using Zapier with Redshift to fetch data from custom queries and trigger a wide array of actions when new rows are detected from either a table or custom query, including sending emails through Gmail or Mailchimp, exporting data to Google Sheets, and more. Zapier's UI enables our non-technical product stakeholders to take over these workflows and customize them as needed. Zapier has several integrations built for Postgres, and since Redshift supports the Postgres protocol, these custom workflows can be easily built in Zapier.
I'm switching our data warehouse from Redshift to Snowflake, and the final obstacle is moving these Zapier integrations. Snowflake doesn't support the Postgres protocol, so it cannot be used as a drop-in replacement for these workflows. No other data source has all the information that we need for these workflows, so connecting to an upstream data source of Snowflake is not an option. I would appreciate guidance on alternatives I could pursue, including the following:
Moving these workflows into application code
Using a foreign data wrapper in Postgres for Snowflake to continue using the existing workflows from a dummy Postgres instance
Using custom-code blocks in Zapier instead of the Postgres integration
I'm not sure if Snowflake has an API that will allow you to do what you want, but you can create a private Zapier integration that has all the same features and permissions as a public integration, customized for your team.
There's info about that process here: https://platform.zapier.com/
You might find it easier to use a vendor solution like Census to forward rows as events to Zapier. Their free plan is pretty sizeable for getting started. More info here https://www.getcensus.com/integrations/zapier
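If you go the custom-code route instead, a small polling helper is one way to emulate the "new rows" trigger. This sketch assumes the official snowflake-connector-python package and a monotonically increasing cursor column; the query builder uses plain string interpolation for readability, so use bound parameters in real code:

```python
def incremental_query(table, cursor_column, last_seen):
    """Build the 'new rows since last run' query that a scheduled
    step would poll with. Illustration only: real code should bind
    last_seen as a parameter rather than interpolate it."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {cursor_column} > '{last_seen}' "
        f"ORDER BY {cursor_column}"
    )

def fetch_new_rows(conn_params, table, cursor_column, last_seen):
    """Poll Snowflake for new rows using the official Python connector."""
    import snowflake.connector  # pip install snowflake-connector-python
    with snowflake.connector.connect(**conn_params) as conn:
        cur = conn.cursor()
        cur.execute(incremental_query(table, cursor_column, last_seen))
        return cur.fetchall()
```

The caller would persist the highest cursor value it has seen (Zapier's storage, a small table, etc.) and hand each new row to the downstream action.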

Integration of Oracle DB or APEX with Abinitio

How can I integrate Oracle DB/APEX with Ab Initio?
Scenario: I have Ab Initio code that generates a monthly report. As a database developer, I want to run that same Ab Initio code to generate the monthly report, so I am looking for a connector between Ab Initio and Oracle DB/APEX.
To integrate Oracle with another system, you have several options:
1) JDBC: develop the integration in Java as an ETL job, or use a tool such as Talend or another ETL product.
2) An Oracle database link (ask your DBA whether this is possible).
3) Expose the data as JSON through a web service (developed in Java, Node, Python, and so on) and consume that JSON.
If I were you, I would use option 3, because it is the cleanest.
If you need more info, let me know.
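As a sketch of option 3, a tiny Python web service (stdlib http.server plus the python-oracledb driver) could expose the report table as JSON. The table name, columns, and connection details are all placeholders:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def rows_to_json(columns, rows):
    """Serialize a cursor result (column names + row tuples) as a JSON array."""
    return json.dumps([dict(zip(columns, row)) for row in rows])

def fetch_report_rows():
    """Query the report table; table name and credentials are placeholders."""
    import oracledb  # pip install oracledb
    with oracledb.connect(user="reports", password="secret",
                          dsn="dbhost/ORCLPDB1") as conn:
        cur = conn.cursor()
        cur.execute("SELECT month, total FROM monthly_report")
        cols = [d[0] for d in cur.description]
        return rows_to_json(cols, cur.fetchall())

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = fetch_report_rows().encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    """Start the JSON endpoint; call serve() to run it."""
    HTTPServer(("", port), ReportHandler).serve_forever()
```

Ab Initio (or anything else) can then consume the JSON over plain HTTP without knowing anything about the Oracle side.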
Preliminary step: make sure the JDBC drivers are installed and accessible.
Create a DBC file to specify the connection to the Oracle DBMS.
Use m_db to generate a DBC template and also to test the connection from the shell or the GDE.

How to create a database on IBM Bluemix?

I have created an application on Bluemix. I need to copy my database to Bluemix so that it can be accessed from my adapter. Can anyone give me detailed steps on how to proceed?
First thing: if your database is reachable through the Internet and you only need to connect to it from the application, note that a CF application on Bluemix can access the public network, so it is already able to connect to your DB in this scenario.
Assuming you do need to migrate the DB to Bluemix (you didn't specify which kind of database you want to migrate), here are the main (though not all) options you currently have:
RDBMS:
PostgreSQL by Compose (you need an account on compose.io)
SQL Database (DB2, only Premium plan available)
ClearDB (MySQL)
ElephantSQL (this is basically a PostgreSQL as a Service - that is you have to work on the db via API)
you could use the RDBMS capability of dashDB
No-SQL:
Cloudant (documental)
Redis by Compose (ultra fast key-value db. You need an account on compose.io)
MongoDB by Compose (you need an account on compose.io)
IBM Graph (graph No-SQL db)
I suggest taking a look at the Bluemix Catalog (subcategory Data and Analytics) and referring to the Docs as well.
You can create a dashDB service on Bluemix and copy/upload your data to the dashDB database, then use the dashDB VCAP credentials to connect to it from your adapter, or bind the dashDB service to your application on Bluemix.
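For the "connect via VCAP credentials" part, a small helper can pull the bound dashDB credentials out of the VCAP_SERVICES environment variable that Bluemix injects into CF apps. The exact service key and credential field names can vary by service and plan, so treat the ones below as assumptions:

```python
import json

def dashdb_credentials(vcap_json):
    """Extract dashDB credentials from a VCAP_SERVICES document.

    Cloud Foundry apps on Bluemix receive bound-service credentials in
    the VCAP_SERVICES environment variable; the 'dashDB' key and the
    credential field names below follow that convention but may vary.
    """
    services = json.loads(vcap_json)
    creds = services["dashDB"][0]["credentials"]
    return {
        "host": creds["host"],
        "port": creds["port"],
        "database": creds["db"],
        "user": creds["username"],
        "password": creds["password"],
    }
```

In the adapter you would call `dashdb_credentials(os.environ["VCAP_SERVICES"])` and feed the result to your DB driver.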

Import WebTrends Data to SQL Server Using SSIS

I need to import all the Webtrends data into my local SQL Server database, and on top of that I need to build reports.
I am very new to Webtrends, so can someone suggest how to import the Webtrends data into my local DB server?
Just for context - I work at Webtrends.
My first question is: are you using the on-site software version or the OnDemand SaaS version? The best approach depends on which one.
If you're using the latest software version, the best way to get data out of the Webtrends Analytics product data store is the ODBC connector available for the product. You can download it from the web UI, and it installs with a single click. I can provide more documentation; just send me an e-mail and I'll send you the documentation file on this.
The other method, currently in beta and available only to OnDemand customers, is to use the web services (REST-based) that we've created. Check out the documentation here: http://product.webtrends.com/dxapi/index.html
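If you want to smoke-test the ODBC connector outside SSIS (for example from Python with pyodbc) before wiring up the import package, a DSN-based connection is enough. The DSN name "WebtrendsAnalytics" below is a placeholder for whatever you configured when installing the connector:

```python
def odbc_connection_string(dsn, uid=None, pwd=None):
    """Build an ODBC connection string for a configured DSN."""
    parts = [f"DSN={dsn}"]
    if uid:
        parts.append(f"UID={uid}")
    if pwd:
        parts.append(f"PWD={pwd}")
    return ";".join(parts)

def fetch_webtrends_rows(sql):
    """Run a query over the Webtrends ODBC connector."""
    import pyodbc  # pip install pyodbc
    conn = pyodbc.connect(odbc_connection_string("WebtrendsAnalytics"))
    try:
        return conn.cursor().execute(sql).fetchall()
    finally:
        conn.close()
```

In SSIS itself, the same DSN can back an ODBC source component feeding an OLE DB destination pointed at the local SQL Server database.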
