Salesforce Metadata APIs

I want to retrieve a list of metadata components, like ApexClass, using the Salesforce Metadata API.
I'm getting a list of all the Apex classes (2,246 in total) in the Salesforce org using the following code, and it's taking too much time to retrieve these file names:
ListMetadataQuery query = new ListMetadataQuery();
query.type = "ApexClass";
double asOfVersion = 23.0;

// Assume that the SOAP binding has already been established.
FileProperties[] lmr = metadataService.listMetadata(
    new ListMetadataQuery[] { query }, asOfVersion);

if (lmr != null)
{
    foreach (FileProperties n in lmr)
    {
        string filename = n.fileName;
    }
}
My requirement is to get only the metadata components (Apex classes) that were developed by my organization, so that I fetch just the components relevant to me and can possibly save time by not retrieving all the classes.
How can I achieve this?
Thanks in advance.

I've not used the Metadata API directly, but I'd suggest either filtering on the created-by field or using a prefixed name on your classes so you can filter on that; see the sketch below.
I'm not sure if server-side filters are possible, though! As for speed, my experience of using the Metadata API via Eclipse is that it's always pretty slow and there's not much you can do about it!
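As far as I know, ListMetadataQuery only accepts a type and folder, so any filtering has to happen client-side after the call returns. A minimal sketch under that assumption, reusing the lmr array from the question (the "MyOrg_" prefix and the author name are hypothetical):

// Client-side filter over the listMetadata() result.
// FileProperties exposes createdByName and namespacePrefix, so we can
// keep only classes that were authored in this org.
List<string> myClasses = new List<string>(); // needs System.Collections.Generic
foreach (FileProperties p in lmr)
{
    // Skip classes that came in from managed packages.
    if (!string.IsNullOrEmpty(p.namespacePrefix))
        continue;

    // "MyOrg_" is a hypothetical naming convention; adjust to yours,
    // or match on the author instead.
    if (p.fullName.StartsWith("MyOrg_") || p.createdByName == "Jane Developer")
        myClasses.Add(p.fileName);
}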

Related

How to generate a SAS token for an Azure blob container in Angular?

I am generating SAS (shared access signature) tokens for my Azure blob containers, which use the private access level, from a .NET Core application, and it is working fine.
Code:
private static string GetContainerSasUri(CloudBlobContainer container, string storedPolicyName = null)
{
    string sasContainerToken;
    if (storedPolicyName == null)
    {
        SharedAccessBlobPolicy adHocPolicy = new SharedAccessBlobPolicy()
        {
            SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-1),
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
            Permissions = SharedAccessBlobPermissions.Read
        };
        sasContainerToken = container.GetSharedAccessSignature(adHocPolicy, null);
    }
    else
    {
        sasContainerToken = container.GetSharedAccessSignature(null, storedPolicyName);
    }
    return container.Uri + sasContainerToken;
}
Now I want to do the same (generate SAS tokens) using Angular. I've googled it and found some links, but none of them explain this in detail. Is there any way to do this?
The simple answer is no, you can't create SAS tokens from a client-side library like Angular. Well, technically you can, but creating a SAS token requires the storage account key, and doing it on the client side would mean exposing that key to everyone who uses your application.
The better option is to make an API call to your back-end service and have that service create the SAS token for you; a sketch follows. That way you keep your account key safe.
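A minimal sketch of that back-end piece, assuming ASP.NET Core and reusing the GetContainerSasUri method from the question; the route, controller name, and injected CloudBlobClient wiring are my own assumptions:

[ApiController]
[Route("api/[controller]")]
public class SasController : ControllerBase
{
    private readonly CloudBlobClient _blobClient; // registered in DI elsewhere (assumption)

    public SasController(CloudBlobClient blobClient)
    {
        _blobClient = blobClient;
    }

    // GET api/sas/{containerName} - returns a short-lived, read-only container URI.
    [HttpGet("{containerName}")]
    public ActionResult<string> Get(string containerName)
    {
        CloudBlobContainer container = _blobClient.GetContainerReference(containerName);
        return GetContainerSasUri(container); // the method from the question
    }
}

The Angular app then just issues an HTTP GET against /api/sas/{containerName} and uses the returned URI directly against blob storage.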

How to efficiently organise Firestore methods and connections

I'm currently working with Firebase Firestore and Next.js. I've googled how to organise a Firestore project, but most of the approaches (all, actually) aren't scalable.
What I have tried is to have a folder containing all the Firebase-related components, such as configuration and utility methods. The most challenging part has been writing a general-purpose function to get a collection/document ref that supports all the relevant methods, namely .orderBy(), .limit(), .where() and .doc(). It's also really tough to write a transformer that transforms the data returned by the database into another format.
Here's what I have implemented:
getDocRef.js is a helper function that puts all the methods mentioned above together; getOnce.js and observe.js expose methods that interact with the database; and db.js contains the configuration.
Also, for anyone who's interested, here's my naive solution for the getDocRef() function:
import db from '../db';

/*
Options:
- ref: Specify the ref of a document
- collectionName: Specify the collection name
- queryArgs: Specify the arguments to be passed down to .where()
- orderByArgs: Specify the arguments to be passed down to .orderBy()
- limit: Specify the fetching limit
- docName: Specify the document name/id
*/
export default options => {
  const { ref, collectionName, queryArgs, orderByArgs, limit, docName } = options;
  if (ref != null) return ref;
  const initRef = db.collection(collectionName);
  if (docName != null) return initRef.doc(docName);
  if (queryArgs != null) {
    if (orderByArgs != null) {
      if (limit != null)
        return initRef
          .where(...queryArgs)
          .orderBy(...orderByArgs)
          .limit(limit);
      return initRef.where(...queryArgs).orderBy(...orderByArgs);
    }
    return initRef.where(...queryArgs);
  }
  return initRef;
};
So, I would love to know if my current use of Firebase is okay. If not, what project structure should I apply? How should I improve my current structure to make it more efficient? And last but not least, is there an alternative to my naive JS solution posted above? Thanks in advance.
My personal approach:
Extract all credentials to .env with the dotenv package.
Create a directory called /lib/db with two files:
init.js, which initialises Firebase/Firestore.
Another file with a class exposing some methods for CRUD.
If your project is getting big, I suggest extracting each collection's related methods to its own file in /lib/db and organising them there (somewhat like state management).

Make a Solr query from GeoTools through GeoServer

As the title says, I'm trying to run a query from GeoTools (through GeoServer) to get features from a Solr index.
To be more precise:
I saw in the GeoServer user manual that I can query Solr like this over HTTP:
http://localhost:8080/geoserver/wfs?service=WFS&version=1.1.0&request=GetFeature
&typeName=mySolrLayer
&format="xxx"
&viewparams=q:"mySolrQuery"
The important part of this URL is the viewparams parameter, which I want to set directly from GeoTools.
I have already tested this (here is part of my code):
URL url = new URL(
    "http://localhost:8080/geoserver/wfs?request=GetCapabilities&VERSION=1.1.0");

Map<String, Object> params = new HashMap<>();
params.put(WFSDataStoreFactory.URL.key, url);
params.put("viewparams", "q:myquery");

Hints hints = new Hints();
hints.put(Hints.VIRTUAL_TABLE_PARAMETERS, "q:myquery");
query.setHints(hints);
...
featureSource.getFeatures(query);
But this doesn't seem to work: the request sent to GeoServer is a plain GetFeature request without the viewparams parameter.
I tried this with GeoTools 12.2, 13.2 and 15-SNAPSHOT, but I didn't succeed in passing the query; GeoServer sends me all the features in my database and doesn't pick up viewparams as a parameter.
I need to do it like this because the query actually comes from another program, and I want to pass it easily to another part of the project...
Can someone help me?
There doesn't currently seem to be a way to do this in the GeoTools WFSDataStore implementations, as the GetFeature request is constructed from the URL provided by the GetCapabilities document. This is as the standard requires, but it may be worth filing a feature enhancement request to allow clients to override this string (as QGIS does, for example), which would let you specify the additional parameter in your base URL; it would then be passed to the server as you need.
Unfortunately, the WFS module lives in unsupported land at present, so unless you have the resources to work on this issue yourself and can provide a PR to implement it, there is not a great chance of it being implemented.

Parsing Swagger JSON data and storing it in a .NET class

I want to parse the Swagger data from the JSON I get from {service}/swagger/docs/v1 into a dynamically generated .NET class.
The problem I am facing is that different APIs can have different numbers of parameters and operations. How do I dynamically parse the Swagger JSON for different services?
My end result should be a list of all APIs and their operations, held in a variable I can easily search.
Did you ever find an answer for this? Today I wanted to do the same thing, so I used the AutoRest open-source project from MSFT, https://github.com/Azure/autorest. While it looks like it's designed for generating client code (code to consume the API documented by your swagger document), at some point on the way to producing this code it had to have done exactly what you asked in your question: parse the Swagger file and understand the operations, inputs and outputs the API supports.
In fact, we can get at this information, since AutoRest exposes it publicly.
So use NuGet to install AutoRest. Then add references to AutoRest.Core and AutoRest.Model.Swagger. So far I've just simply gone for:
using Microsoft.Rest.Generator;
using Microsoft.Rest.Generator.Utilities;
using System.IO;
...
var settings = new Settings();
settings.Modeler = "Swagger";

var mfs = new MemoryFileSystem();
mfs.WriteFile("AutoRest.json", File.ReadAllText("AutoRest.json"));
mfs.WriteFile("Swagger.json", File.ReadAllText("Swagger.json"));
settings.FileSystem = mfs;

// Sanity check that the config file made it into the bin directory.
var b = System.IO.File.Exists("AutoRest.json");

settings.Input = "Swagger.json";

Modeler modeler = Microsoft.Rest.Generator.Extensibility.ExtensionsLoader.GetModeler(settings);
Microsoft.Rest.Generator.ClientModel.ServiceClient serviceClient;

try
{
    serviceClient = modeler.Build();
}
catch (Exception exception)
{
    throw new Exception(String.Format("Something nasty hit the fan: {0}", exception.Message));
}
The swagger document you want to parse is called Swagger.json and lives in your bin directory. The AutoRest.json file you can grab from their GitHub (https://github.com/Azure/autorest/tree/master/AutoRest/AutoRest.Core.Tests/Resource). I'm not 100% sure how it's used, but it seems to be needed to tell the tool what it supports. Both JSON files need to be in your bin directory.
The serviceClient object is what you want: it contains information about the methods, model types and method groups the API exposes.
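For example, something like the following should let you list and search the operations. The property names (Methods, HttpMethod, Parameters) are from my recollection of the AutoRest object model, so treat them as assumptions rather than a verified API:

// Hypothetical walk over the parsed model: print every operation and its
// parameters so they can be collected into a searchable structure.
foreach (var method in serviceClient.Methods)
{
    Console.WriteLine("{0} {1}", method.HttpMethod, method.Name);
    foreach (var parameter in method.Parameters)
    {
        Console.WriteLine("  parameter: {0}", parameter.Name);
    }
}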
Let me know if this works. You can try it with their resource files; I used their ExtensionsLoaderTests for reference when I was playing around (https://github.com/Azure/autorest/blob/master/AutoRest/AutoRest.Core.Tests/ExtensionsLoaderTests.cs).
(Also, thank you to Denis, an author of AutoRest.)
If this is still a question, you can use the Swagger Parser library:
https://github.com/swagger-api/swagger-parser
It's as simple as:
// parse a swagger description from the petstore and get the result
SwaggerParseResult result = new OpenAPIParser().readLocation("https://petstore3.swagger.io/api/v3/openapi.json", null, null);
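And since the question is about .NET, a comparable route there (my own suggestion, not something from the answers above) is the Microsoft.OpenApi.Readers NuGet package, which parses a Swagger/OpenAPI description into a searchable object model:

using System;
using System.IO;
using Microsoft.OpenApi.Models;
using Microsoft.OpenApi.Readers;

// Read a Swagger/OpenAPI document and list every operation it defines.
using (FileStream stream = File.OpenRead("Swagger.json"))
{
    OpenApiDocument document = new OpenApiStreamReader().Read(stream, out OpenApiDiagnostic diagnostic);
    foreach (var path in document.Paths)
    {
        foreach (var operation in path.Value.Operations)
        {
            Console.WriteLine("{0} {1}", operation.Key, path.Key);
        }
    }
}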

How can I upload a file to a SharePoint document library using Silverlight and client web services?

Most of the solutions I've come across for SharePoint document library uploads use the HTTP PUT method, but I'm having trouble finding a way to do this in Silverlight because it restricts the available HTTP methods. I visited http://msdn.microsoft.com/en-us/library/dd920295(VS.95).aspx to see how to allow PUT in my code, but I can't see how that helps you actually issue an HTTP PUT.
I am using client web services, which limits the SharePoint functions available.
That leaves me with these questions:
Can I do an HTTP PUT in Silverlight?
If I can't, or if there is a better way to upload a file, what is it?
Thanks
Figured it out! Works like a charm:
public void UploadFile(String fileName, byte[] file)
{
    // format the destination URL
    string[] destinationUrls = { "http://qa.sp.dca/sites/silverlight/Answers/" + fileName };

    // fill out the metadata
    // remark: don't set the Name field, because this is the name of the document
    SharepointCopy.FieldInformation titleInformation = new SharepointCopy.FieldInformation
    {
        DisplayName = fileName,
        InternalName = fileName,
        Type = SharepointCopy.FieldType.Text,
        Value = fileName
    };

    // to specify the content type
    SharepointCopy.FieldInformation ctInformation = new SharepointCopy.FieldInformation
    {
        DisplayName = "XML Answer Doc",
        InternalName = "ContentType",
        Type = SharepointCopy.FieldType.Text,
        Value = "xml"
    };

    // note: ctInformation is declared but not added here, so only the
    // title field is actually sent with the file
    SharepointCopy.FieldInformation[] metadata = { titleInformation };

    // initialize the web service
    SharepointCopy.CopySoapClient copyws = new SharepointCopy.CopySoapClient();

    // execute the CopyIntoItems method
    copyws.CopyIntoItemsCompleted += copyws_CopyIntoItemsCompleted;
    copyws.CopyIntoItemsAsync("http://null", destinationUrls, metadata, file);
}
Many thanks to Karine Bosch for the solution here: http://social.msdn.microsoft.com/Forums/en/sharepointdevelopment/thread/f135aaa2-3345-483f-ade4-e4fd597d50d4
What type of SharePoint deployment, and what version of Silverlight? If, say, it is an intranet deployment, you could use UNC paths to access your document library in SharePoint together with the SaveFileDialog/OpenFileDialog available in Silverlight 3:
http://progproblems.blogspot.com/2009/11/saveread-file-from-silverlight-30-in.html
or
http://www.kirupa.com/blend_silverlight/saving_file_locally_pg1.htm
Silverlight has restrictions on what it can do with local files, though I've read that Silverlight 4 changes some of this:
http://www.wintellect.com/CS/blogs/jprosise/archive/2009/12/16/silverlight-4-s-new-local-file-system-support.aspx
