How to dynamically create and delete namespaces in Cloudflare Workers KV - database

I am trying to build a mini database in Workers that may have to be dynamic. I have a Workers backend going and a couple of KV namespaces. When the app is live, I would like the ability to create a new namespace and then populate it, or to delete a namespace and its items. I know how to add, read, and delete key-value pairs in existing namespaces within the script using NAMESPACE.get and so on. However, I do not know how to create or delete a namespace within the script. I know I can do it from the Workers dashboard or the wrangler CLI, but I really need the application to make its own namespaces while it's live.
How can I do this?
I was told to post here from webmasters.stackexchange:
https://webmasters.stackexchange.com/questions/139465/how-to-dynamically-create-and-delete-namespaces-in-cloudflare-workers-kv

You can manage Workers KV namespaces via the API: https://api.cloudflare.com/#workers-kv-namespace-create-a-namespace
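For reference, namespace creation and deletion are plain REST calls. Below is a minimal sketch in TypeScript; the account ID, API token, and function names are placeholders, and the routes are the documented /accounts/:account_id/storage/kv/namespaces endpoints.
// Hedged sketch: create and delete a KV namespace through the Cloudflare REST API.
// accountId and apiToken are placeholders you supply yourself.
const API = "https://api.cloudflare.com/client/v4";

async function createNamespace(accountId: string, apiToken: string, title: string): Promise<string> {
  const res = await fetch(`${API}/accounts/${accountId}/storage/kv/namespaces`, {
    method: "POST",
    headers: { Authorization: `Bearer ${apiToken}`, "Content-Type": "application/json" },
    body: JSON.stringify({ title }),
  });
  if (!res.ok) throw new Error(`Namespace creation failed: ${res.status}`);
  const body: any = await res.json();
  return body.result.id; // the namespace ID, needed for later API calls
}

async function deleteNamespace(accountId: string, apiToken: string, namespaceId: string): Promise<void> {
  const res = await fetch(`${API}/accounts/${accountId}/storage/kv/namespaces/${namespaceId}`, {
    method: "DELETE",
    headers: { Authorization: `Bearer ${apiToken}` },
  });
  if (!res.ok) throw new Error(`Namespace deletion failed: ${res.status}`);
}
Note that the ID you get back is only useful for further REST calls; the Worker itself still can't read from the new namespace without a binding, which is the limitation described next.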
Workers bindings can't be dynamically managed from within a Worker, though. You'd need to redeploy the Worker itself to add or remove a KV binding before the new namespace could be used efficiently from your code.
I wouldn't recommend either of the above.
Instead, I would build an abstraction on top of a single KV namespace: add your own prefix to each key access, depending on which "namespace" you want to touch.
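A minimal sketch of that abstraction as a TypeScript Worker; the DATA binding name, the key layout, and the helper names are assumptions for illustration:
interface Env {
  DATA: KVNamespace; // one real KV namespace bound in wrangler.toml; the name is an assumption
}

// Wrap the single bound namespace so each "virtual namespace" is just a key prefix.
function virtualNamespace(kv: KVNamespace, name: string) {
  const prefix = `${name}:`;
  return {
    get: (key: string) => kv.get(prefix + key),
    put: (key: string, value: string) => kv.put(prefix + key, value),
    delete: (key: string) => kv.delete(prefix + key),
    // "Deleting the namespace" means listing every key under the prefix and deleting it.
    async deleteAll() {
      let page = await kv.list({ prefix });
      while (true) {
        await Promise.all(page.keys.map((k) => kv.delete(k.name)));
        if (page.list_complete) break;
        page = await kv.list({ prefix, cursor: page.cursor });
      }
    },
  };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const users = virtualNamespace(env.DATA, "users"); // "created" on first use, no redeploy needed
    await users.put("alice", JSON.stringify({ plan: "free" }));
    return new Response(await users.get("alice"));
  },
};
Because every virtual namespace shares one delimiter scheme, you can still enumerate one of them with list({ prefix }), which is what makes the bulk delete above possible.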

Related

Adding WebAPI to Composite-C1

I am really impressed with Composite C1 and the ability to add data types on the fly, reference other data types with a foreign key relationship, etc. The built-in functions are also really handy.
Ideally, I could create a separate Web API project that would be able to use all of the functions that are generated on the fly in Composite C1 and expose them as REST services.
Should I do this as a separate project referencing the DLLs in Composite C1's bin directory? Those DLLs are regenerated each time a custom data type is altered... would that cause an issue?
I just wrote a post tonight that I think answers the question:
http://www.s-innovations.dk/Blog/2013/06/25/Mobile-Services-for-Composite-C1--Idea-Creation
I build my API by tapping into the C1 system and getting my data from there, so there is no need to worry about the DLLs being regenerated and such.
You can also create your own API in a separate DLL; my post covers the basic steps of getting an API up and running.
You could run into problems if your Web API exposes something that a user later deletes in the console: then your DLL breaks. The idea with my project is to dynamically expose the types configured from within the console, so if someone deletes a type, it is also removed from my API.

Salesforce: script to create custom object and fields

Is there a way to create custom objects and fields by using a script or an IDE?
Salesforce is very easy to use; however, it's time-consuming to create so many fields through the web interface. So I'm just wondering if there's a way to use a script or an IDE to create objects and fields in Salesforce.
You're looking for the Metadata API, or already developed tools which use the metadata api.
http://www.salesforce.com/us/developer/docs/api_meta/Content/meta_intro.htm
http://www.salesforce.com/us/developer/docs/api_meta/index.htm
Using it directly will still require some development, though, which may not save you much time: you get metadata as XML, but you would still need to process it into whatever you want to achieve.
It also depends somewhat on the nature of what you want to do. I, for instance, had a requirement today for 150 custom labels based on an input file. It was much faster to generate the metadata XML than to ever do that in the web interface; I then deployed the metadata using the Force.com IDE.
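As an illustration of that last approach, here is a rough sketch of generating a CustomLabels metadata file from a CSV of name,value pairs. The script, the file names, and the CSV layout are assumptions for illustration; the generated file would then be deployed with the Force.com IDE or the Metadata API.
// Hypothetical labels.csv with one "name,value" pair per line.
import { readFileSync, writeFileSync } from "fs";

const rows = readFileSync("labels.csv", "utf8")
  .trim()
  .split("\n")
  .map((line) => line.split(","));

// One <labels> entry per CSV row, following the CustomLabels metadata layout.
const labels = rows
  .map(([name, value]) => [
    "    <labels>",
    `        <fullName>${name}</fullName>`,
    "        <language>en_US</language>",
    "        <protected>false</protected>",
    `        <shortDescription>${name}</shortDescription>`,
    `        <value>${value}</value>`,
    "    </labels>",
  ].join("\n"))
  .join("\n");

const xml = [
  '<?xml version="1.0" encoding="UTF-8"?>',
  '<CustomLabels xmlns="http://soap.sforce.com/2006/04/metadata">',
  labels,
  "</CustomLabels>",
].join("\n");

writeFileSync("CustomLabels.labels", xml);
The same generate-then-deploy pattern applies to custom objects and fields, just with different metadata types and file layouts.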

What is the correct way to solve the ambiguous reference issue in WCF services?

Project Structure
I have a Silverlight project, SLProj, that references a Silverlight class library project called ServiceClients. ServiceClients has two WCF service references, Svc1.svc and Svc2.svc. Svc1.svc and Svc2.svc live in two different WCF projects that use the same set of DataContracts, which in turn live in yet another class library project, MyDataContracts.dll.
Problem description
Now, in my ServiceClients project, I get an ambiguous reference error when I need to use a DataContract class that is present in both service references. If this were a WinForms or WebForms project, I could reference MyDataContracts.dll and reuse the common types. But since MyDataContracts.dll was built as a non-Silverlight class library, it can't be referenced from the Silverlight project.
Workaround...
I am not sure if this below is the best method to go about taking care of this issue. Can anybody let me know if there is a cleaner way to solve this problem, or is this the best way we have so far?
1. Create a single service reference.
2. Click the 'Show All Files' button in Solution Explorer.
3. Drill into the service reference, find Reference.svcmap and open it.
4. Find the MetadataSources section.
5. Add a second line to include the address of your second service, for example:
<MetadataSource Address="http://address1.svc" Protocol="http" SourceId="1" />
<MetadataSource Address="http://address2.svc" Protocol="http" SourceId="2" />
6. Save, close, and update the service reference.
Use Automapper
Map the DataContracts with AutoMapper.
You will have to invest some time in understanding AutoMapper and reworking your application, and AutoMapper adds overhead because every data object gets mapped. But first, you get a clean solution without hacks, and second, you gain a decoupled, simple data-object layer just for your client. Once it's done you can forget about the mapping, yet you stay flexible for future changes.
If you have never worked with AutoMapper, it's important to play around with it before starting; it takes some time to become familiar with.
These are the rough steps:
1. Create a subdirectory and sub-namespace Data and copy the DataContracts. Remove the attributes and properties your client doesn't need because these mapped classes live only in your client. You can also change some types or flatten some complex properties.
2. Create an AutoMapperInit.cs as described in the AutoMapper documentation (read the Getting Started guide). Alias the conflicting references like this:
using ref1 = YourProjectServiceReference1;
using ref2 = YourProjectServiceReference2;
3. Wrap the service client like this:
public Example GetExample() {
    // Call the first service and map its DataContract onto the client-side type.
    // (Classic AutoMapper static API; newer versions use an IMapper instance instead.)
    return Mapper.Map<ref1.Example, Example>(ref1.YourService.GetExample());
}
The wrapper also needs the same using directives as in step 2.
4. In this wrapper add a static initializer like this (assuming your wrapper class is called Wrapper):
static Wrapper() {
    // Register all maps once, before the first mapping call.
    AutoMapperInit.CreateMaps();
}
5. Omit the service references in the client code and instead add using YourClient.Data;, the namespace you created in step 1.
Your client is now decoupled from the service and you don't have conflicts anymore.
Disclaimer: I am not affiliated with AutoMapper. I just used it in a project with a similar problem and am happy with it and wanted to share my experience.
Your workaround is actually quite OK. We've used it in several projects like this, with three service references. It is really a workaround for the IDE, which for some reason only lets you select one service at a time when creating a service reference.
Another thing you could try out is multi-targeting your shared contracts to both .NET and Silverlight from the same codebase. Details on how to do this are described at http://10rem.net/blog/2009/07/13/sharing-entities-between-wcf-and-silverlight. It might be more work, but it feels less hacky.

How to generate new table related code in existing Ria Service without deleting it

How can I generate the code for a new table in an existing RIA domain service without deleting what is already there? Please suggest a best practice.
I have one domain service and have modified a lot of the auto-generated code and metadata. Now I want to include the auto-generated code for a couple more tables without deleting my changes.
Your best option is probably to add a second, temporary domain service that includes those tables and then copy the generated code over to the existing domain service. Remove the temporary domain service once you are finished.
The other option is to hand craft the new code. Use the existing code for the other tables as an example.
My understanding is that the generator is only intended to be used for the creation of new domain services.

Best strategy to initially populate a Grails database backend

I'd like to know your approach/experiences when it's time to initially populate the Grails DB that will hold your app data. Assuming you have CSVs with the data, is it "safer" to create a script (with whatever tool fits you) that:
1. Generates the BootStrap commands with the domain classes, runs it in a test or dev environment, and then uses the native DB commands to export the data to prod?
2. Creates the DB insert script directly, assuming GORM's version = 0 and manually incrementing the soon-to-be auto-generated IDs?
My fear is that the second approach may lead to inconsistencies, since Hibernate is responsible for ID generation, and there may be something else I'm missing.
Thanks in advance.
Take a look at this link. It allows you to run Groovy scripts in the normal Grails context, giving you access to all Grails features, including GORM. I'm currently importing data from a legacy database and have found that writing a Groovy script that pulls the data out through the Groovy SQL interface and then puts it into domain objects appears to be the easiest thing to do. Once you have the data imported, you just use the commands specific to your database system to move that data to the production database.
Update:
Apparently the entry referenced from the blog post I linked to no longer exists. I was able to get this working using the code at the following link, which is also referenced in the comments:
http://pastie.org/180868
Finally, it seems the simplest solution is to take into account that GORM, as of the current release (1.2), uses a single sequence for all auto-generated IDs. Keeping that in mind when creating whatever scripts you need (in the language of your preference) should suffice. I understand it's planned for the 1.3 release that every table gets its own sequence.
