How do I display the price of my Coin/Token in the TronLink wallet

PS: My apologies if this question is not appropriate for the site.
My problem, in a nutshell, is this: I have a coin/token on the Tron network, and it is listed on the CoinGecko website.
However, when the coin appears in the TronLink wallet, the symbol "-" is always shown in place of its price, meaning the price is unknown.
I searched the internet for any way to show its price in the wallet, but without result; unfortunately, TronLink wallet support is very weak.
Is there a solution to this?

I think it's a bit late, but TronLink, like many other crypto wallets, uses the CoinMarketCap API to get token/coin prices, so listing your token on CoinMarketCap is a must.

Related

How to sort the Activity Timeline by Profile/User on an Account in Salesforce

I am a relative novice with the Salesforce interface and the platform itself, so I apologize if I misuse some terminology. In my current role I am taking on the challenge of learning Apex, and one of the tasks I have been assigned is figuring out how to sort/filter the Activity Timeline by Profile/User on an Account's page. I have been reading up on this topic but haven't found anything concrete. The closest thing to an answer is the following link,
https://trailblazer.salesforce.com/ideaView?id=0873A0000003XdlQAE
however, based on the conversation there, I believe that post is asking for an additional built-in filter beyond Date Range, Activities, and Activity Type. With that said, I was wondering whether it is possible to:
1. Filter the Activity Timeline by User, and if so,
2. How can I complete this task, whether through Apex or some other method?
The following image is the Activity Timeline I am referring to, and the names highlighted in yellow are the Users/Profiles I mean. My objective is to sort the display by these names instead of the default chronological order per month. Thank you in advance!
A bit late to this, but as far as I know there's no way to filter the standard component, which is why I built TimelinePlus (you can find it on the AppExchange). It's kind of a complicated build once you get past the basics, but it's certainly possible to build this from scratch; there are also SLDS components to help with the styling.
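If you do build it from scratch, the data side is just a SOQL query over Task/Event filtered by Account and grouped by owner; the actual component would be Apex/LWC. A rough sketch of that query using the simple-salesforce Python client (the credentials and Account Id below are placeholders):

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="your-password",
                security_token="your-token")
account_id = "001000000000001AAA"  # hypothetical Account Id

# Pull the account's Tasks grouped by owner; Events would work the same way.
soql = (
    "SELECT Id, Subject, ActivityDate, Owner.Name "
    f"FROM Task WHERE AccountId = '{account_id}' "
    "ORDER BY Owner.Name, ActivityDate DESC"
)
for task in sf.query_all(soql)["records"]:
    print(task["Owner"]["Name"], "-", task["Subject"])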

How to download Amazon MWS Customisation fields

I'm looking into how to process customisation fields for Amazon orders and according to their MWS API Docs, if a customer chooses to personalise his order, then a URL to download this data comes down in the Order Item XML's BuyerCustomizedInfo node:
<OrderItem xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<ASIN>ABC123</ASIN>
...
<ConditionSubtypeId>New</ConditionSubtypeId>
<BuyerCustomizedInfo>
<CustomizedURL>https://zme-caps.amazon.com/t/ABC123/ABC123/1</CustomizedURL>
</BuyerCustomizedInfo>
</OrderItem>
My client has given me two such orders to look at, and when I click on those links all I get is
NoSuchURL: Url id 'ABC123' has expired or does not exist!
I know that the ZIP will contain JSON which I will have to parse and may also contain references to SVGs, and that I must also make the code extra robust when dealing with customisation fields.
Am I getting this error because these links are time sensitive or one time use only? Or is it something else?
First off, I'm not a developer, I'm an Amazon seller - I found your question while doing research, as I'm trying to figure out what is possible and sketch a plan for a similar system before hiring a developer.
I've pasted some info below that I found for the US - the implementation of Amazon Custom in the European marketplaces may not be the same as in the US, though.
In general it is very hard to get good info on anything to do with Amazon Custom, and it seems to have a messed-up logic of its own - but feel free to ask anything and I will help if I can.
First of all, make sure you have the most up-to-date Amazon MWS Orders API SDK. If you don't, and refuse to update, you can fall back to an order report from the Reports API, which will also include the ZIP URL, but you'll have to parse it yourself and life will be hell.
Next, for the order, call ListOrderItems, which you probably already do. You'll see the customization in the response XML under BuyerCustomizedInfo -> CustomizedURL.
This is a ZIP. Download the ZIP using cURL, and put plenty of checks and fallbacks in place, because it will fail sometimes.
Extract the ZIP to a folder. Inside that folder there will be a JSON file.
Parse that JSON file, and you'll probably know where to go from there for putting that information into your system.
Depending on how you've configured your product, there may also be an SVG file that you'll want to parse to get some customization info. Specifically json->{'version3.0'}->customizationInfo->surfaces (each surface)->areas. Each area should be a text line or an image. At least, that is how it is for how we've set up our products. A sketch of these steps follows below.
As always, put in lots of checks, try/catches, fallbacks, and error alerts.
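For illustration, here is a minimal sketch of that download-extract-parse flow in Python; using requests instead of cURL is my choice, and the key names come straight from the description above, so treat the structure as an assumption:

import io
import json
import zipfile
import requests

def fetch_customization(customized_url):
    # Download the ZIP behind BuyerCustomizedInfo -> CustomizedURL.
    resp = requests.get(customized_url, timeout=30)
    resp.raise_for_status()  # the links expire, so expect failures here
    with zipfile.ZipFile(io.BytesIO(resp.content)) as archive:
        # The archive is expected to contain a JSON file describing the customization.
        json_names = [n for n in archive.namelist() if n.endswith(".json")]
        if not json_names:
            raise ValueError("no JSON file found in customization ZIP")
        return json.loads(archive.read(json_names[0]))

def iter_areas(payload):
    # Walk version3.0 -> customizationInfo -> surfaces -> areas, per the answer above.
    info = payload.get("version3.0", {}).get("customizationInfo", {})
    for surface in info.get("surfaces", []):
        for area in surface.get("areas", []):
            yield area  # each area should be a text line or an image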
The links are time sensitive and expire after 6 months, I think.
The real links should be a little more complex; if that is exactly the link you are seeing, it's incorrect.
You don't require any auth to download them and the easiest way to test them is via the MWS Scratchpad.

Google Experiment shows 0 conversions but goal has plenty

I have a website with a booking system on another domain.
The cross domain tracking is working fine.
I have set up a goal called "Booking system entered" which successfully tracks visitors hitting the first step in the booking system.
In my new Experiment I have used this goal for conversions. But even though I know there have been several conversions - meaning people going from version A or version B of my experiment to the booking system - the experiment keeps saying 0 conversions. The number of views is correct, but conversions are not counted at all.
The goal destination ("Begins with") is set up with the first part of the URL of the booking system.
The tracking codes on both domains have the following code added (with their domain):
_gaq.push(['_setDomainName', 'DOMAIN NAME HERE'])
And the experiment code has:
_udn = "DOMAIN NAME HERE";
Without knowing the details, I would say you should check that the Google Analytics tracking code is still on the site where the goals happen.
Also make sure that the cookie domain, path and domain hash in the experiment code are the SAME as in the Google Analytics tracking code. That should solve your problem.

Where to get an updated list of video games?

I am currently designing a review site for video games, similar to GameSpot, and am wondering where and whether there is an online database with an API that contains information such as name, publisher, release date etc. I don't really want to have to enter each title manually or let users enter the titles manually.
Where do these large sites get information like this? I wouldn't think it would be manual. I know IMDB exists for movies.
How would I go about adding it to my database?
Thanks
May I point you to web scraping?
Be sure to read the sections on legal issues and on well-behaved bots.
There's always Amazon and their product advertising API. Some older, but interesting code snippets can be found on this page.
If you know Perl, there is an amazing module called WWW::Mechanize.
With it you can pretty much write a script to get to any website and grab any data you need.
So, for example, you can go to www.gamespot.com, get a list like the one below, and put the entries in your database:
http://www.gamespot.com/games.html?platform=1029&mode=all&sort=views&dlx_type=all&sortdir=asc&official=all&tag=games%3Bfooter%3Bmore
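The same idea as a rough Python sketch - requests and BeautifulSoup are assumed installed, and the selector is a placeholder since the real GameSpot markup will differ:

import requests
from bs4 import BeautifulSoup

# Hypothetical listing URL; see the GameSpot link above for the real parameters.
resp = requests.get("http://www.gamespot.com/games.html",
                    params={"platform": 1029, "mode": "all"})
soup = BeautifulSoup(resp.text, "html.parser")
for link in soup.select("a.game-title"):   # placeholder selector for game links
    print(link.get_text(strip=True))       # insert into your database instead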

Heuristics to discover spammers/bots (In forums, blogs etc)

The ways I can think of are:
1. Measure the time between actions.
2. Compare the posts' content (if they're too similar to each other) or, better yet, only the posted links.
3. Check the distribution of activity over the period the user is active (if the user is posting, say, once every hour for a week, then we have either a superman or a bot here).
4. Expect some special activity: on Stack Overflow, for example, I would expect users to press their user name link (top middle) to see their new answers, comments, questions etc.
5. (added by chakrit) The number of links in a post.
6. Not a heuristic: use some async JS for the user login. (This just makes life a bit harder for the bot programmer.)
7. (added by Alekc) Not a heuristic: User-Agent values.
And how could I forget Google's approach (mentioned below by Will Hartung): give users the ability to mark someone as a spammer; enough spam votes means this is a spam user. (Calculating how many votes are enough is the work here.)
Any more ideas?
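For what it's worth, heuristics 1 and 5 are only a few lines each; a toy sketch with made-up thresholds:

import re
import time

MIN_SECONDS_BETWEEN_POSTS = 30  # made-up threshold for heuristic 1
MAX_LINKS_PER_POST = 5          # made-up threshold for heuristic 5

last_post_time = {}  # user id -> timestamp of that user's previous post

def looks_like_bot(user_id, body):
    now = time.time()
    too_fast = now - last_post_time.get(user_id, 0.0) < MIN_SECONDS_BETWEEN_POSTS
    too_many_links = len(re.findall(r"https?://", body)) > MAX_LINKS_PER_POST
    last_post_time[user_id] = now
    return too_fast or too_many_links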
I might be overestimating the intelligence of bot creators, but number 6 is completely useless against any semi-decent bot creator. Using the C# browser control to create your bot would pretty much render number 6 useless, and from what I've seen, that's a pretty common approach in that type of software.
Validating the user agent is pretty much useless too; all of the blog spam I used to get was from bots appearing to be valid web browsers.
I used to get a lot of blog spam - I would literally be deleting hundreds of comments a day. I made use of reCAPTCHA and now I might get one a month.
If you really want to build something like this, I would attempt the following:
1. The user starts off with no ability to post a URL.
2. After X number of their posts have been analyzed in relation to the other posts in the thread, give them access to post URLs.
3. The user's activity on the site, their post quality, and whatever other factors you deem necessary become a reputation for that user's IP.
4. Then, based on the reputation of that IP and of the other IPs on the same subnet, you can make whatever other decisions you want (a rough sketch follows below).
That was just the first thing that came to mind. Hope it helps.
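To make that concrete, here is a rough sketch of the gating and IP-reputation idea (the threshold and scoring are placeholders):

from collections import defaultdict

URL_PRIVILEGE_THRESHOLD = 10  # the "X number of posts" above; value is made up

posts_analyzed = defaultdict(int)  # posts seen per IP
reputation = defaultdict(int)      # running reputation score per /24 subnet

def subnet(ip):
    # Group IPv4 addresses by /24 so neighbours share a reputation.
    return ".".join(ip.split(".")[:3])

def record_post(ip, quality_score):
    # quality_score comes from whatever post analysis you run.
    posts_analyzed[ip] += 1
    reputation[subnet(ip)] += quality_score

def may_post_urls(ip):
    return posts_analyzed[ip] >= URL_PRIVILEGE_THRESHOLD and reputation[subnet(ip)] >= 0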
The number of links in a post.
I believe I've read somewhere that Akismet uses the number of links as one of its major heuristics.
Most of the spam comments on my blog contain 10+ links.
Speaking of which... you might want to check out the Akismet API itself - it is extremely effective.
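Calling Akismet's comment-check endpoint is only a few lines; a minimal sketch in Python (the API key and blog URL are placeholders):

import requests

API_KEY = "your-akismet-key"  # placeholder
ENDPOINT = f"https://{API_KEY}.rest.akismet.com/1.1/comment-check"

def is_spam(user_ip, user_agent, content):
    resp = requests.post(ENDPOINT, data={
        "blog": "https://example.com/",  # your site's URL
        "user_ip": user_ip,
        "user_agent": user_agent,
        "comment_content": content,
    })
    return resp.text.strip() == "true"  # the endpoint answers "true" or "false"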
How about searching for spam-related keywords in the post body?
Not a heuristic but an effective approach: You can also keep up-to-date with the stats published by StopForumSpam using their APIs.
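A quick sketch of an IP lookup against their JSON endpoint (the field names are taken from their public API docs, so treat them as an assumption):

import requests

def listed_in_stopforumspam(ip):
    resp = requests.get("https://api.stopforumspam.org/api",
                        params={"ip": ip, "json": ""})
    entry = resp.json().get("ip", {})
    return bool(entry.get("appears"))  # "appears" is 1 for known spam sources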
Time between page visits is a common one, I believe.
I need to add a comment section to my personal site and am thinking of asking people to give me their email address; I'll email them a "publish comment" link.
You might want to check whether they've come from a blacklisted spam IP address (see http://www.spamhaus.org/).
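Blacklists like Spamhaus are queried over DNS: reverse the IP's octets and look them up under the list's zone; any A record in the answer means the IP is listed. A minimal sketch:

import socket

def listed_in_spamhaus(ip):
    reversed_ip = ".".join(reversed(ip.split(".")))
    try:
        socket.gethostbyname(f"{reversed_ip}.zen.spamhaus.org")
        return True   # any answer means the IP is on the list
    except socket.gaierror:
        return False  # NXDOMAIN: not listed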
There is another answer that suggests using Akismet for detecting spam, which I completely endorse.
However, they are not the only player on the block.
There is TypePad AntiSpam, which uses the same heuristics as Akismet, as well as the same API (just a different URL and API key; the structure of the calls is the same). It is safe to say they take pretty much the same approach as Akismet.
You might also want to check out Project Honeypot. From what I can tell, it can do a lookup based on the IP address of the user, and if it is a known malicious IP, it will tell you (harvester or something like that).
Finally, you can check LinkSleeve which approaches comment spam with what it claims to be a different way. Basically, it checks the links that are being linked to in comments, and based on where the links are going to, makes a determination.
Don't forget the ultimate heuristic: The "Report Spam" button that users can click. If nothing else, this gives you as administrator a chance to update your rule base for stuff that may be slipping through. Of course, you can simply delete the offending post and user right away as well.
I have some doubts about point 4; anyway, I would also add User-Agent values. They are pretty easy to fake, but in my experience about 90% of bots use Perl as the UA.
I am sure there is a web service of some kind from which you can get a list of top SEO keywords; check the content for those keywords, and if the content is too rich in them, suspect it of being spam.
