How to get the goal name or goal ID from Google Analytics in SSIS? - sql-server

Is it possible to get the goal name or goal ID from the Google Analytics Source task (ZappySys) in SSIS?
I've been searching for a long time now and I just can't find it among the task's Dimensions or Metrics.
I also tried using the eventAction dimension from the task, but goalCompletionsAll does not show the same results as the website.
The dimensions I've been using to compare are Source and eventAction, together with the metric goalCompletionsAll.

You cannot get the name of the goal, but you can get a specific goal's completions through its ID (visible in the GA admin panel) using the ga:goalXXCompletions metric, where XX is the goal's numeric ID.
Also, you cannot compare event actions and goals, because they have different scopes: events are recorded at the hit level, while goals are counted at the session level. That is why you get different numbers.
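For illustration, outside of SSIS the same metric can be requested through the Reporting API v4; the sketch below only shows how the per-goal metric name is formed. The view ID and goal number are placeholders, and this is a generic assumption about the API, not ZappySys-specific code.

```javascript
// Sketch of a Google Analytics Reporting API v4 request body that pulls
// completions for one goal, broken down by source. "XX" in
// ga:goalXXCompletions is replaced by the goal's numeric ID from the
// Admin panel. The view ID is a placeholder.
function buildGoalReportRequest(viewId, goalId) {
  return {
    reportRequests: [{
      viewId: viewId,
      dateRanges: [{ startDate: '30daysAgo', endDate: 'today' }],
      dimensions: [{ name: 'ga:source' }],
      // Per-goal metric: session-scoped completions for one goal slot.
      metrics: [{ expression: 'ga:goal' + goalId + 'Completions' }],
    }],
  };
}

const request = buildGoalReportRequest('123456789', 5);
console.log(request.reportRequests[0].metrics[0].expression);
// -> ga:goal5Completions
```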

Related

Limit the number of rows counted in Google Data Studio

I have a scorecard that looks at the number of URL clicks driven by all queries, which works as expected. I am now trying to display the number of clicks driven by the top 10 queries. I was able to limit my table to the top 10 queries by disabling pagination, but now I want to sum those clicks in a scorecard to provide a quick summary rather than having a table.
I don't think what you want to do is possible dynamically via just the Search Console connector. Google Data Studio does not provide any way to calculate rankings via calculated fields, so there's no way for you to know which query is in the top 10 without looking at a sorted table. A few imperfect alternatives (roughly in order of increasing complexity):
You apply a filter so that the scorecard only aggregates values above a certain threshold. This would be hardcoded, so you would be filtering on Clicks (i.e. aggregate all URL clicks above 100).
You apply a filter to the scorecard so that it only aggregates clicks from the top 10 URLs. This would not be a dynamically updating filter, so you'd have to look at the table to see which URLs are currently in the top 10, and that set will change over time. The filter would end up looking like: "Include URLs Contains www.google.com,www.stackoverflow.com"
If you do not mind using Google Sheets as an intermediary, you could dump your Search Console data into a spreadsheet, manipulate it however you like, and then use the spreadsheet as the data source for Data Studio (instead of the Search Console connector). There appear to be add-ons for this that work out of the box, although I haven't used them myself, so I'm not sure how difficult they are. Alternatively, you could build something yourself with Google Apps Script and the Search Console API.
You could build a custom Data Studio Community Visualization. (Just because they are called 'Community Visualizations' does not mean you have to make them publicly available.) Essentially, you would be building a scorecard-like component that aggregates the data according to your own rules, although this requires more coding experience. (Before you build one, check whether something like what you need already exists in the gallery; at a quick glance, I don't see anything that meets your needs.)
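If you go the Sheets/scripting route, the "top 10 clicks" summary itself is a small computation. A minimal sketch, assuming you have already pulled rows of queries and clicks (the sample data below is made up):

```javascript
// Given Search Console rows of { query, clicks }, compute the total
// clicks for the top n queries ranked by clicks.
function topNClicks(rows, n) {
  return rows
    .slice()                                // don't mutate the input
    .sort((a, b) => b.clicks - a.clicks)    // rank by clicks, descending
    .slice(0, n)                            // keep the top n queries
    .reduce((sum, row) => sum + row.clicks, 0);
}

// Invented sample data, for illustration only.
const rows = [
  { query: 'foo', clicks: 120 },
  { query: 'bar', clicks: 80 },
  { query: 'baz', clicks: 10 },
];
console.log(topNClicks(rows, 2)); // -> 200
```

In Apps Script this function would run over the values read from the sheet, with the result written to the cell your Data Studio scorecard reads.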

How to store results of an advanced search?

Currently I am trying to add new functionality to my system, in which users will be able to see custom lists of products.
To create these lists I will most likely use an algorithm or some criteria to gather data from my database, or sometimes use hand-picked items.
I wonder what the best way to do that is, in terms of storage and computational time. I was thinking of using an object in my model (something like CustomList) that stores some attributes about the list, which is basically filled with products returned by an advanced search against my database. It would store a query string, or something like it, so the search can be reprocessed periodically, and, if the list is personalized, re-run for every user that requests it.
Example of a query (in natural language): "Select all items that are cheaper than 15 dollars and are designed for the gender of the user X"
I don't know whether there is a better way to do that. I also wonder how Spotify builds its personalized and custom lists (like Discover Weekly, Running music, Sleepy Monday, et cetera).
Should I store a query string in an attribute of this model object? Should I do all of this without a model object (on the fly)? What are the best options? How do big companies do this?
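One common answer to the question above is to store the list's criteria as structured data rather than a raw query string, and re-run it on demand. A hypothetical sketch; all field names are illustrative, not from your schema:

```javascript
// Store criteria as data, so the list can be reprocessed periodically
// and per-user placeholders (like "the user's gender") resolved at
// run time. Field names are invented for illustration.
class CustomList {
  constructor(name, criteria) {
    this.name = name;
    this.criteria = criteria; // e.g. { maxPrice: 15, gender: 'ofUser' }
  }

  // Resolve per-user placeholders, then filter the product catalogue.
  run(products, user) {
    return products.filter(p =>
      p.price < this.criteria.maxPrice &&
      (this.criteria.gender !== 'ofUser' || p.gender === user.gender)
    );
  }
}

// "Cheaper than 15 dollars, designed for the gender of user X":
const list = new CustomList('cheap-for-me', { maxPrice: 15, gender: 'ofUser' });
const products = [
  { id: 1, price: 10, gender: 'f' },
  { id: 2, price: 10, gender: 'm' },
  { id: 3, price: 20, gender: 'f' },
];
console.log(list.run(products, { gender: 'f' }).map(p => p.id)); // -> [ 1 ]
```

In practice the criteria object would be serialized (e.g. as JSON) into a column of the CustomList table and translated into a database query rather than an in-memory filter; the shape of the idea is the same.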

How to fetch thousands of rows from a database without slowing down?

I want an auto-search option in a textbox, with the data fetched from the database. I have thousands of rows in my database table (almost 8,000-10,000). I know how to achieve this, but since I am fetching thousands of rows it will take a lot of time. How can I achieve this without it slowing down? Should I follow some other methodology apart from simple fetching? I am using Oracle SQL Developer for the database.
Besides the obvious solutions involving indexes and caching: if this is web technology, then depending on your tool you can sometimes set a minimum input length before the server call is made. Here is a jQuery UI example: https://api.jqueryui.com/autocomplete/#option-minLength
"The minimum number of characters a user must type before a search is performed. Zero is useful for local data with just a few items, but a higher value should be used when a single character search could match a few thousand items."
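The same minimum-length idea can be sketched without jQuery; fetchSuggestions below is a stand-in for the real AJAX call, and the sample data is invented:

```javascript
// Only fire the server call once the user has typed at least
// minLength characters; shorter input returns an empty list locally.
function makeAutocomplete(fetchSuggestions, minLength) {
  return function onInput(text) {
    if (text.length < minLength) {
      return [];                     // too short: skip the round-trip
    }
    return fetchSuggestions(text);   // long enough: query the server
  };
}

// Fake backend for illustration.
const names = ['alice', 'albert', 'bob'];
const search = makeAutocomplete(
  prefix => names.filter(n => n.startsWith(prefix)),
  3
);

console.log(search('al'));  // -> [] (below minLength, no server call)
console.log(search('alb')); // -> [ 'albert' ]
```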
It depends on your web interface, but you can use two techniques:
1. Paginate your data: if your requirements are to accept empty values and to show all the results, load them in blocks of a predefined size. Google, for example, paginates its search results. On Oracle, pagination is done with the rownum pseudocolumn (see this response). Beware: you must first issue a query with an ORDER BY and then enclose it in a new one that uses rownum. Databases that use the LIMIT keyword behave differently. If you apply the pagination technique to a drop-down, you end up with an infinite scroll (see this response for an example).
2. Limit your data by imposing a filter that reduces the number of rows returned; your search displays results only after the user has typed at least n characters in the field.
You can combine 1 and 2, but unless you find an existing web component (a jQuery one, for example) it may be a difficult task if you don't have JavaScript knowledge.
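The ORDER BY-then-rownum nesting described above can be sketched as a small query builder; the table and column names are placeholders, and on Oracle 12c+ you could use OFFSET/FETCH instead:

```javascript
// Classic Oracle rownum pagination: the inner query is ordered first,
// then wrapped so ROWNUM applies to the already-ordered result.
function paginate(baseSql, orderBy, page, pageSize) {
  const last = page * pageSize;
  const first = last - pageSize + 1;
  return (
    'SELECT * FROM (' +
    ' SELECT t.*, ROWNUM rn FROM (' +
    `  ${baseSql} ORDER BY ${orderBy}` +
    ' ) t WHERE ROWNUM <= ' + last +
    ') WHERE rn >= ' + first
  );
}

// Page 2 of 10 rows: the query keeps ordered rows 11..20.
const sql = paginate('SELECT name FROM customers', 'name', 2, 10);
console.log(sql.includes('ROWNUM <= 20') && sql.includes('rn >= 11')); // -> true
```

In a real application the page bounds would be bind variables, not concatenated values, to avoid SQL injection.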

accessing large number of records in SOQL

In Salesforce, I have a custom object that may contain millions of records in the future. I am developing a dashboard using Apex and Visualforce, so I need to access all the records at once in a single query.
The query is: [SELECT COUNT(Id), Status__c FROM Custom_Case__c GROUP BY Status__c]
So it reads all the records at once and exceeds governor limits.
What can I do to achieve this?
Please provide a solution, ideally with an example, because I am new to Salesforce.
It is a shame that Salesforce counts an aggregate function not as one result row, but as the number of rows aggregated. This means that if you aggregate over 50,000 records, you'll get an error (an "SObject query row limit exceeded" error, or something along those lines).
There is an Idea for that, please upvote!
The only workable solution I see for you at the moment (and the one I use myself) is to write and schedule an Apex batch job, and save the results in a new custom object. Then use that custom object as the source for your dashboard.
You can also run your page in read-only mode. There are certain restrictions on this as well; you might want to take a look at the link below:
http://www.salesforce.com/us/developer/docs/pages/Content/pages_controller_readonly_context.htm

Using search server with cakephp

I am trying to implement a customized search in my application. The table structure is given below.
main table:
teacher
sub tables:
skills
skill_values
cities
city_values
The search is triggered by location, which is stored in the city_values table with reference fields user_id and city_id. The name of the city and its latitude and longitude are found in the cities table.
Searching also includes skills; the table relations are similar to city. The users table and the skill_values table are related by the user_id field in skill_values, and the skills and skill_values tables are related by the skill_id field in skill_values.
We need to find the location of the user performing the search and filter the results to within a 20-mile radius. There are a few other filters as well.
My problem is that I need to filter these results without a page reload, so I am using AJAX; but if the number of records increases, my AJAX request will take a long time to return a response.
Would it be a good idea to use an open-source search server such as Sphinx or Solr to fetch results from the server?
I am using CakePHP for development and my application is hosted on a cloud server.
... but if the number of records increases, my AJAX request will take a long time to return a response.
Regardless of the search technology, there should be a pagination mechanism of some kind.
You should therefore be able to set the limit or maximum number of results returned per page.
When a user performs a search query, you can use Javascript to request the first page of results.
You can then simply increment the page number and request the second, third, fourth page, and so on.
This should mean that the top N results always appear in roughly the same amount of time.
It's then up to you to decide whether you request each page of search results sequentially (i.e. as the callback for each successful response), or wait for some kind of user input (i.e. clicking a 'more' link or scrolling to the end of the results).
The timeline/newsfeed pages on Twitter or Facebook are a good example of this technique.
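The incremental-loading flow described above can be sketched framework-free; fetchPage stands in for the AJAX call to your search backend, and the sample data is invented:

```javascript
// Each call to loadMore requests the next page and appends it to the
// accumulated results, so the top N hits always arrive in roughly
// constant time regardless of total result count.
function makePagedSearch(fetchPage, pageSize) {
  let page = 0;
  let results = [];
  return function loadMore() {
    page += 1;                                   // advance to the next page
    results = results.concat(fetchPage(page, pageSize));
    return results;
  };
}

// Fake backend returning 25 numbered hits, for illustration.
const all = Array.from({ length: 25 }, (_, i) => 'hit-' + (i + 1));
const fetchPage = (page, size) => all.slice((page - 1) * size, page * size);

const loadMore = makePagedSearch(fetchPage, 10);
console.log(loadMore().length); // -> 10 (first page)
console.log(loadMore().length); // -> 20 (first two pages)
```

Wiring loadMore to a 'more' link gives the click-to-load pattern; wiring it to a scroll listener gives the infinite scroll seen on Twitter or Facebook.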
