I need to download a keystore and use it while creating the source connector. Currently, I am overriding the open method, but this gets called for each of my source connectors.
@Override
public void open(Configuration configuration) throws Exception {
    // do a few things here, e.g. download the keystore to its default path
    super.open(configuration);
}
Is there an option to run some initialization as soon as the task manager comes up?
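For context, the workaround I have in mind is to guard the download with a static flag so it runs at most once per TaskManager JVM, even though open() is called for every parallel source instance. A rough sketch only; KeystoreAwareSource and downloadKeystore() are placeholder names, and it assumes the source extends RichSourceFunction:

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichSourceFunction;

public abstract class KeystoreAwareSource<T> extends RichSourceFunction<T> {

    // The keystore should be fetched at most once per TaskManager JVM, even though
    // open() runs once per parallel source instance.
    private static volatile boolean keystoreReady = false;

    @Override
    public void open(Configuration configuration) throws Exception {
        synchronized (KeystoreAwareSource.class) {
            if (!keystoreReady) {
                downloadKeystore(); // hypothetical helper: fetch the keystore to its default path
                keystoreReady = true;
            }
        }
        super.open(configuration);
    }

    // Placeholder for the actual download logic.
    protected abstract void downloadKeystore() throws Exception;
}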
EDIT 2: I have managed to get past the GlobalDatastoreConfig has already been set error. I managed to pinpoint all the locations that were getting called before the init function; they were in static initializer blocks in some unexpected files.
I have now pointed ALL calls to DatastoreServiceFactory.getDatastoreService() to a new static function I've created in a file called Const.java.
private static boolean hasInit = false;

public static DatastoreService getDatastoreService() {
    if (!hasInit) {
        try {
            // Point the App Engine Datastore API at the Cloud project instead of the local emulator.
            CloudDatastoreRemoteServiceConfig config = CloudDatastoreRemoteServiceConfig
                .builder()
                .appId(CloudDatastoreRemoteServiceConfig.AppId.create(
                    CloudDatastoreRemoteServiceConfig.AppId.Location.US_CENTRAL, "gcp-project-id"))
                .build();
            CloudDatastoreRemoteServiceConfig.setConfig(config);
            hasInit = true;
        } catch (Exception ignore) {
            // swallow the "already set" error and reuse whatever config is active
        }
    }
    return DatastoreServiceFactory.getDatastoreService();
}
This returns no errors on the first initialisation. However, I am getting a new error now!
Dec 08, 2022 6:49:56 PM com.google.appengine.api.datastore.dev.LocalDatastoreService init
INFO: Local Datastore initialized:
Type: High Replication
Storage: C:\Users\user\dev\repo\Celbux\core\Funksi179_NSFAS_modules\classes\artifacts\Funksi179_NSFAS_modules_war_exploded\WEB-INF\appengine-generated\local_db.bin
Dec 08, 2022 6:49:56 PM com.google.appengine.api.datastore.dev.LocalDatastoreService load
INFO: Time to load datastore: 20 ms
2022-12-08 18:49:56.757:WARN:oejs.HttpChannel:qtp1681595665-26: handleException / java.io.IOException: com.google.apphosting.api.ApiProxy$CallNotFoundException: Can't make API call urlfetch.Fetch in a thread that is neither the original request thread nor a thread created by ThreadManager
2022-12-08 18:49:56.762:WARN:oejsh.ErrorHandler:qtp1681595665-26: Error page too large: 500 org.apache.jasper.JasperException: com.google.apphosting.api.ApiProxy$RPCFailedException: I/O error
Full stacktrace: https://pastebin.com/YQ2WvqzM
Pretty sure the first of the errors is invoked from this line:
DatastoreService ds = Const.getDatastoreService();
Key ConstantKey = KeyFactory.createKey("Constants", 1);
Entity Constants1 = ds.get(ConstantKey); // <-- This line.
EDIT 1: I am not using Maven. Here are the .jars I have in WEB-INF/lib:
appengine-api-1.0-sdk-1.9.84.jar
appengine-api-labs.jar
appengine-api-labs-1.9.76.jar
appengine-api-stubs-1.9.76.jar
appengine-gcs-client.jar
appengine-jsr107cache-1.9.76.jar
appengine-mapper.jar
appengine-testing-1.9.76.jar
appengine-tools-sdk-1.9.76.jar
charts4j-1.2.jar
guava-11.0.2.jar
javax.inject-1.jar
json-20190722.jar
Original Question:
The company that I'm working at has a legacy GCP codebase written in Java. This codebase uses the appengine-api-1.0-sdk.jar library. Upon running this CloudDatastoreRemoteServiceConfig code in the very first place our DatastoreService gets initialised, it says that the config has already been set.
If someone can shed light on how to get this outdated tech connected to the Cloud via localhost, I'll be most grateful!
web.xml
<filter>
    <filter-name>NamespaceFilter</filter-name>
    <filter-class>com.sintellec.funksi.Filterns</filter-class>
</filter>
<filter-mapping>
    <filter-name>NamespaceFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
Code
public class Filterns implements javax.servlet.Filter {

    private FilterConfig filterConfig;

    public void init(FilterConfig filterConfig) {
        try {
            CloudDatastoreRemoteServiceConfig config = CloudDatastoreRemoteServiceConfig
                .builder()
                .appId(CloudDatastoreRemoteServiceConfig.AppId.create(
                    CloudDatastoreRemoteServiceConfig.AppId.Location.US_CENTRAL, "gcp-project-id"))
                .build();
            CloudDatastoreRemoteServiceConfig.setConfig(config);
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        } catch (Exception e) {
            System.out.println(e);
            return;
        }
        this.filterConfig = filterConfig;
    }
}
I got this code snippet from here.
I was thinking of a few ideas:
Perhaps there's GCP code that's called before our Java code which initialises the Local DB
Perhaps I need to set a global environment variable to point this old emulator to a Cloud Configuration instead
The only problem is I have no idea what to do from here; I'm hoping someone has experience with this legacy Java library.
To clarify: I am trying to get this outdated GCP Java codebase (appengine-api-1.0-sdk.jar) to connect to Cloud Datastore, NOT use the Local Datastore Emulator. This is so I can debug multiple applications that all access the same Cloud DB.
It is very difficult to say, especially with that amount of code, and we can only guess; but, as you indicated, probably some code is initializing your Datastore configuration before yours runs, possibly the SDK itself. You could try setting a breakpoint in the setConfig method of CloudDatastoreRemoteServiceConfig and analyzing the call stack.
In any case, one thing you could also try is not performing that initialization in your code at all, instead delegating the authentication of your client libraries to Application Default Credentials.
For local development you have two options for configuring those Application Default Credentials.
On one hand, you can use user credentials, i.e., you can use the gcloud CLI to authenticate against GCP as a user with the required permissions to interact with the service, by issuing the following command:
gcloud auth application-default login
Please, don't forget to revoke those credentials when you no longer need them:
gcloud auth application-default revoke
On the other hand, you can create a service account with the necessary permissions and a corresponding service account key, and download that key, a JSON file, to your local filesystem. See this for instructions specific to Datastore. Then set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the downloaded file with your service account key.
Again, a word of caution: take care of the downloaded service account key file and never put it under version control, because anyone with that file could assume the permissions granted to the service account.
Your code should work without further problems when running in GCP, because you will probably be using a service that supports attaching a service account, which means that Application Default Credentials are provided by the GCP services themselves.
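To quickly verify that either of those two setups actually yields usable Application Default Credentials on your machine, here is a minimal sketch, assuming the google-auth-library-oauth2-http artifact is on the classpath (AdcCheck is just a throwaway class name; note that the legacy appengine-api SDK does not necessarily consume these credentials itself, so this only validates the ADC setup):

import com.google.auth.oauth2.GoogleCredentials;

import java.io.IOException;

public class AdcCheck {
    public static void main(String[] args) {
        try {
            // Resolves credentials from GOOGLE_APPLICATION_CREDENTIALS or from the
            // gcloud application-default login cache.
            GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
            credentials.refreshIfExpired();
            System.out.println("Application Default Credentials resolved successfully.");
        } catch (IOException e) {
            System.err.println("No Application Default Credentials found: " + e.getMessage());
        }
    }
}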
As the title says, I am trying to use dotnet ef database update from the command line and getting the error Format of the initialization string does not conform to specification starting at index 0.
From searching around this site and the internet, everything points to the connection string being wrong, but the connection string works fine when compiling and running the application. I am trying to add Identity to the project so I can have users with passwords, and am following the Deep Dive tutorial on Pluralsight, but when it gets to this part, the code fails.
My connection string in appsettings.json is
"ConnectionStrings": {
"DefaultConnection": "Server=PTI-VWS12-002;Database=EPDM_TestVault;Trusted_Connection=true;MultipleActiveResultSets=true;"
},
The code in my Startup.cs is:
var migrationAssembly = typeof(Startup).GetTypeInfo().Assembly.GetName().Name;
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection"),
        sql => sql.MigrationsAssembly(migrationAssembly)));
though I've also tried it without the migrations assembly. I'm really not sure what could be wrong with my connection string.
EDIT: My constructor:
public IConfiguration Configuration { get; }
public Startup(IConfiguration configuration) { Configuration = configuration; }
And my Program.cs has the ASP.NET Core 2.1 default:
public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>();
EDIT 2: Solved.
I'm still not sure what I did wrong in my project, but I got the Identity tables to generate using the official ASP.NET samples repository over here: https://github.com/aspnet/Docs. Using the exact migration file from the IdentityDemo, and plopping in my connection string from above, I was able to create the Identity tables in my database.
You first need to configure IConfiguration using a ConfigurationBuilder in the Startup constructor, or in Program.cs before the Kestrel server starts.
var builder = new ConfigurationBuilder()
    .SetBasePath(env.ContentRootPath)
    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
    .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
    .AddEnvironmentVariables();
_configuration = builder.Build();
I'm still not sure what I did wrong in my project, but I got the Identity tables to generate using the official ASP.NET samples repository over here: https://github.com/aspnet/Docs. Using the exact migration file from the IdentityDemo, and plopping in my connection string from above, I was able to create the Identity tables in my database.
EDIT: it still doesn't run in the context of my program.
I have a message that comes in via a queue. I want to send that message off to a signing service, which returns a signature. I then want to put the original message and the signature into a Zip file as two separate Zip entries. I'm asking for the world on a stick: I want to do this as a blueprint, entirely in XML, with no compiled Java code (other than my signing microservice, which is already built and running in our infrastructure).
Ideas?
Looking at the docs and playing around with it, I think I can... maybe. It seems the default aggregators might not do quite what I need for this use case.
Just found the solution. Below is a PoC:
context.addRoutes(new RouteBuilder() {
    @Override
    public void configure() throws Exception {
        from("timer://foo?fixedRate=true&period=1s").to("direct:start");

        from("direct:start").setBody().simple("hello")
            .multicast(new ZipAggregationStrategy(true, true))
                .to("direct:a", "direct:b")
            .end()
            .to("file://target?fileName=any.zip");

        from("direct:a").setHeader("CamelFileName").simple("data.txt").to("log:mylog");
        from("direct:b").setHeader("CamelFileName").simple("signature.txt").to("http://mysignatureservice");
    }
});
How can we run an application task in the background even if the app on Android or iPhone is closed?
You could perform a BackgroundFetch, but it's not guaranteed to always be active due to OS limitations and the user's ability to turn the feature off manually.
public class MyApplication implements BackgroundFetch {
    @Override
    public void performBackgroundFetch(long deadline, Callback<Boolean> onComplete) {
        // perform the background activity here
        onComplete.onSucess(Boolean.TRUE);
    }
}
You can find sample demo code here.
I'm developing a portlet that is deployed as a WAR. Database models are created by Service Builder. How can I insert initial data into the database when the WAR is installed?
Add a portal.properties file to the project with the property:
application.startup.events=com.my.project.MyStartupAction
and implement the startup import as an extension of SimpleAction:
package com.my.project;

import com.liferay.portal.kernel.events.ActionException;
import com.liferay.portal.kernel.events.SimpleAction;

public class MyStartupAction extends SimpleAction {

    @Override
    public void run(String[] arg0) throws ActionException {
        if (startupDataNotExist()) {
            createStartupData();
        }
    }
    ...
You can do this either as a StartupAction, executed on startup of the plugin (read: during deployment AND during subsequent server starts), or as an UpgradeAction.
A good example of this is the sevencogs-hook that comes with Liferay CE and has its source code included. It is implemented as an UpgradeAction - e.g. on first start your database content will be "upgraded" to contain the sevencogs sample data.
You can also do it with an UpgradeProcess. Here are step-by-step instructions.
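A rough sketch of what such an UpgradeProcess could look like (assuming Liferay's com.liferay.portal.kernel.upgrade.UpgradeProcess base class; the class name and helper methods are placeholders, and how the process gets registered depends on your Liferay version):

package com.my.project.upgrade;

import com.liferay.portal.kernel.upgrade.UpgradeProcess;

public class MyInitialDataUpgradeProcess extends UpgradeProcess {

    @Override
    protected void doUpgrade() throws Exception {
        // Runs once when this upgrade step is applied; seed the initial
        // Service Builder entities here.
        if (startupDataNotExist()) {
            createStartupData();
        }
    }

    // Placeholder: check whether the seed records already exist.
    private boolean startupDataNotExist() {
        return true;
    }

    // Placeholder: persist the initial entities via the Service Builder APIs.
    private void createStartupData() {
    }
}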