Need help stringing together database processes

I need some help from those with more knowledge than I possess. I am currently trying to figure out how to get real-time data from a database.
I need to be able to find the company info for the most recent licensees. The search parameter I'm using is 2016-05-10T00:00:00.000.
The full string, put together from the API endpoint and the search parameter, can be found directly at this link:
https://www.hurl.it/?method=GET&url=https%3A%2F%2Fdata.wa.gov%2Fresource%2Fv8vv-gqqs.json&headers=%7B%22X-App-Token%22%3A[%22bjp8KrRvAPtuf809u1UXnI0Z8%22]%7D&args=%7B%22licenseeffectivedate%22%3A[%222004-07-14T00%3A00%3A00.000%22]%7D
So I'm looking to retrieve the most recently added accounts in order to verify (1) that the license is active and (2) that the license number the contractor gives matches what the website says. I would like to figure out how to automate this so that I'll know when the newest licenses are added and they will be extracted/downloaded into Excel.
If anyone can help with this I would appreciate it very much. I also have more questions about using databases if any of you are experts in the field.
Once again, thank you!
Clay

Since your goal is to get this data into Excel, have you considered using something like our OData support instead? You could structure your query in Excel or Power BI and it would automatically refresh the data.
Another option would be to use our CSV output type with an Excel web query. I use the IMPORTDATA(...) function in Google Sheets, which is very similar.
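If you'd rather script it than refresh inside Excel, here is a minimal sketch in Python of the CSV route, assuming the v8vv-gqqs endpoint and the licenseeffectivedate field from your link; the app token value is a placeholder, and $where/$order are standard SoQL parameters.

    # Minimal sketch: pull licensees added after a cutoff date as CSV so Excel
    # can open the file (or an Excel web query can point at the same URL).
    # The app token below is a placeholder for your own token.
    import requests

    ENDPOINT = "https://data.wa.gov/resource/v8vv-gqqs.csv"   # CSV output type
    APP_TOKEN = "your-app-token-here"                         # placeholder

    params = {
        "$where": "licenseeffectivedate > '2016-05-10T00:00:00.000'",
        "$order": "licenseeffectivedate DESC",
    }

    resp = requests.get(ENDPOINT, params=params,
                        headers={"X-App-Token": APP_TOKEN}, timeout=30)
    resp.raise_for_status()

    with open("new_licensees.csv", "wb") as f:
        f.write(resp.content)

Scheduled with cron or Task Scheduler, something like this covers the "tell me when new licenses appear" part; the OData/Power Query route gives you refresh-on-open behaviour without any code.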

Related

Quick Search does not retrieve ("cannot find") recently added records

I am working on Dynamics CRM 2015. Recently we became unable to find contacts that were created recently in CRM via "Quick Search", but we manage to find them with "Advanced Search".
On the other hand, the contacts that already existed are still searchable via the Quick Find view.
The Quick Search view is well configured.
What may be the problem?
Thank you in advance.
This issue is likely a metadata problem with the underlying tables/fields which store the name of the record. A workflow variant of this issue is described by the following article:
http://teameasi.com/blog/crm-quick-search-not-returning-records-that-exist
I have experienced this issue (specifically on the workflow table), and the fix suggested by the author of this article (renaming the workflow) does work for me. I have also experienced similar issues in the past with contact records modified directly via SQL, where the fullname field was not set correctly to reflect the underlying name fields.
Obviously this renaming approach is not particularly friendly for fixing a large number of records; I have some ideas for how to fix records in bulk which I will need to prove out.
Can you provide the following feedback:
- Try renaming one of the affected contacts to see if this approach also works for the contact case.
- What version of CRM are you on, is it on-premise or online?
- Please test whether you receive different results within Advanced Find when you search based on the fullname field vs. the firstname and lastname fields (a rough read-only diagnostic along those lines is sketched below).
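For the on-premise case, where the organization database is reachable, that diagnostic might look like the sketch below (Python with pyodbc). The ContactBase table and its FirstName/LastName/FullName columns are the usual on-premise schema, but the server, database name and full-name format are assumptions for your environment; anything beyond reading should go through the CRM SDK rather than direct SQL.

    # Illustrative, read-only check against an on-premise CRM database:
    # list contacts whose fullname no longer reflects the underlying name
    # fields. Connection details are placeholders; the comparison assumes
    # the default "First Last" full-name format for the organization.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-crm-sql-server;DATABASE=YourOrg_MSCRM;"
        "Trusted_Connection=yes;"
    )

    sql = """
    SELECT ContactId, FirstName, LastName, FullName
    FROM   ContactBase
    WHERE  FullName IS NULL
       OR  FullName <> LTRIM(RTRIM(ISNULL(FirstName + ' ', '') + ISNULL(LastName, '')))
    """

    for row in conn.cursor().execute(sql):
        print(row.ContactId, repr(row.FullName), row.FirstName, row.LastName)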

Users and expiration date

I have a question that I hope someone can help me with: I would like to be able to see how many paying members my website has on a specific date.
They belong to their own role "members" and there is an expiration date for each member. If they do not have an expiration date, they should also be listed.
Should I be able to use the 2sxc module for this, and could anyone tell me how to do it?
You would need to have an intermediate understanding of SQL and of the tables/fields related to the results you are trying to achieve. Though it is possible to get this done in 2sxc, I would recommend starting the effort with the DNN Reports module. That should let you focus on getting the results you need from the SQL query first (since the display part is auto/default). Then, once you have the right query, you could move it over to 2sxc (or any module that allows data to be queried and returned as a result set) and do something more useful.
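As a starting point for that SQL, here's a rough sketch of the kind of query involved, wrapped in Python with pyodbc so it can also be tested outside DNN. The Users, UserRoles and Roles tables (with ExpiryDate on UserRoles) follow the standard DNN schema, but the role name, cut-off date and connection string are placeholders to adjust.

    # Sketch: list users in the "members" role whose role membership has not
    # expired as of a given date (no expiry date counts as active, per the
    # question). Table/column names follow the standard DNN schema;
    # connection details are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-dnn-sql-server;DATABASE=YourDnnDb;Trusted_Connection=yes;"
    )

    sql = """
    SELECT u.UserID, u.Username, u.DisplayName, ur.ExpiryDate
    FROM   Users u
           JOIN UserRoles ur ON ur.UserID = u.UserID
           JOIN Roles r      ON r.RoleID  = ur.RoleID
    WHERE  r.RoleName = ?
      AND (ur.ExpiryDate IS NULL OR ur.ExpiryDate >= ?)
    ORDER  BY u.DisplayName
    """

    rows = conn.cursor().execute(sql, ("members", "2016-05-10")).fetchall()
    print(len(rows), "paying members as of that date")

Once the query returns what you expect in the Reports module, the same SQL can be moved into a 2sxc SQL data source for the display side.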

Parse Database PDF Export

I was wondering if there is any way to export data from the Parse.com database (based on conditions) to a PDF format. There does not seem to be any built-in functionality for this but I may be missing something.
The purpose of this is to create a monthly report of new entries into the database.
The only solution I can find is to pull out ParseObjects using a query against a condition (in this case, creation date) and then manually extract the fields and construct a PDF document using a third-party library.
Although I cannot find any solutions, I feel that this sort of functionality would be commonly required and perhaps I am missing something.
Any help would be appreciated! Thank you.
There is no built-in function. You could try to use a JavaScript library in a Cloud Code background job to write the file and schedule the job to run once a month, or, as you already said, query the data using the API and write the file on your own server/client. Those are pretty much your only options at the moment.
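If you take the query-the-API route, a minimal sketch might look like this in Python, using Parse's REST API plus the reportlab library for the PDF; the class name "Entry", the field names and the keys are placeholders, and the date filter mirrors the "new entries this month" condition.

    # Sketch: fetch this month's new objects from Parse via the REST API and
    # lay them out in a simple PDF with reportlab. "Entry" and the field
    # names are placeholders for your actual class; keys come from your
    # Parse app settings.
    import json
    import requests
    from reportlab.lib.pagesizes import letter
    from reportlab.pdfgen import canvas

    headers = {
        "X-Parse-Application-Id": "YOUR_APP_ID",
        "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    }
    where = {"createdAt": {"$gte": {"__type": "Date",
                                    "iso": "2016-05-01T00:00:00.000Z"}}}

    resp = requests.get("https://api.parse.com/1/classes/Entry",
                        headers=headers,
                        params={"where": json.dumps(where), "limit": 1000})
    resp.raise_for_status()
    results = resp.json()["results"]

    pdf = canvas.Canvas("monthly_report.pdf", pagesize=letter)
    y = 750
    for obj in results:
        pdf.drawString(50, y, "{}  {}".format(obj["createdAt"], obj.get("name", "")))
        y -= 15
        if y < 50:            # crude page break
            pdf.showPage()
            y = 750
    pdf.save()
    print(len(results), "new entries written to monthly_report.pdf")

Run once a month from cron/Task Scheduler (or the equivalent logic in JavaScript inside a Cloud Code background job), that covers the monthly report.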

Export content from an ecommerce site without using the Backend

I have a site that I'm looking to transfer to Volusion. Importing tabled content into Volusion is a breeze; it's getting the content into tables that's the issue. The old site has no real ability to export, nor do I know how to get at its database. I'm thinking there must be some sort of script I can write to take the content from the frontend and download it in some sort of list that I can put into a CSV and import into Volusion.
www.twincitygreetings.com
Any suggestions? I'm hoping to get into the image directory as well and download all of the images for upload to the new site.
You are going to need at the very least a file with product code, product name, weight and price.
Looking at the URL you provided, it doesn't appear that the products there follow any type of orderly structure where you can target the images folder or products based on a known piece of information like a product's code. Unless the back-end has some type of product export function, you may have no choice but to recreate it from scratch.
I don't know if you solved this yet or not, but I would suggest scraping the data, provided you still have the information on the old site. This can be done easily using VBScript and Excel, or, if you aren't very savvy at coding, you could look at a piece of software called Mozenda. There are a whole variety of methods that can be used to scrape data, all of them pretty easy to learn with a bit of research. Basically you write a script that will crawl your DOM and extract the data (to XML works best in my experience).
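If you do go the scripted route, the same idea in Python (requests + BeautifulSoup) is only a handful of lines; the selectors below are placeholders, since they depend entirely on the old site's markup, and you'd feed it the product-page URLs (from the sitemap or category pages).

    # Sketch of the scraping approach: fetch each product page, pull the
    # fields Volusion's import needs, and write them to a CSV. The URL list
    # and CSS selectors are placeholders -- inspect the old site's HTML and
    # adjust them.
    import csv
    import requests
    from bs4 import BeautifulSoup

    product_urls = [
        "http://www.twincitygreetings.com/some-product.html",   # placeholder
    ]

    with open("products.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["productcode", "productname", "productprice", "photourl"])
        for url in product_urls:
            soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
            name = soup.select_one("h1")                    # placeholder selector
            price = soup.select_one(".price")               # placeholder selector
            img = soup.select_one("img.product-image")      # placeholder selector
            writer.writerow([
                url.rstrip("/").split("/")[-1],             # derive a code from the URL
                name.get_text(strip=True) if name else "",
                price.get_text(strip=True) if price else "",
                img["src"] if img else "",
            ])

The image URLs collected in the last column can then be downloaded in a second pass for upload to the new site.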
Hope this helps.

SSIS: Finding Table Used in Other Package(s)/ Integration

I did see some other posts on this, but they were rather old and there does not appear to be any solutions at this point.
I'm trying to determine where particular tables that SSIS loads during a monthly job are being used in other packages. The package that loads these tables has, over the past several months, been taking much longer than before, and I'm trying to see if I can eliminate this load altogether.
I just happened to check the Allocation packages in our database to see how the tables were being used, and discovered that I can't find anywhere when or where those tables are being used. Is there a function or query I can run in SSMS or elsewhere to find this information?
Thx in advance - please let me know if I need to clarify something.
The packages are just XML files. If you have the packages somewhere on your file system you can use any program that searches through text files.
I'm not sure about older SSIS projects, but with an SSIS project in Data Tools for SQL Server 2012 you can just use the built-in search function to search through your entire solution. It will also search the XML of all the packages.
If you don't have this particular information saved anywhere already in your documentation then I think you are going to have some difficulty in finding an accurate way to retrieve this information. However, there are a few automated data collection options that might help you get most of the way there.
The first option: because all SSIS packages are essentially glorified XML that is fed into an engine, you can perform a patterned search on the packages (like grep) to look for that particular table name. Any packages that dynamically retrieve and build the table name, though, would not be found through this method.
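For the grep-style search, something as small as the sketch below works if the .dtsx files are on disk (any text-search tool does the same job); the folder path and table name are placeholders.

    # Sketch: walk a folder of SSIS packages (.dtsx files are just XML) and
    # report which ones mention a given table name. As noted above, packages
    # that build the table name dynamically won't be caught.
    import os

    PACKAGE_DIR = r"C:\SSIS\Packages"      # placeholder path
    TABLE_NAME = "dbo.MyMonthlyTable"      # placeholder table name

    for root, _dirs, files in os.walk(PACKAGE_DIR):
        for name in files:
            if not name.lower().endswith(".dtsx"):
                continue
            path = os.path.join(root, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                if TABLE_NAME.lower() in f.read().lower():
                    print(path)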
Another option would be to run a server-side SQL trace with a pattern match based on the table name(s), limited to the host or application name of SSIS. Run over the course of a month or so, that would make for a fairly accurate list.
I haven't used it myself, but the DOC xPress tool from PragmaticWorks might be what you're looking for.
