I am using an Azure SQL back end for my Xamarin mobile application, with Easy Tables and Node.js handling the back end. I have created a few tables using the Easy Tables creation wizard, but now I wish to manually migrate some of my old MySQL tables, such as my table named users, from another database into my new Azure SQL database.
1) I have used the SqlMigration3 tool to convert the MySQL tables to Azure SQL, changed the schema to match the Easy Tables schema, and they are now in my Azure SQL database.
2) I have created the Node.js files users.js and users.json in my App Service Editor, and the table now shows up in Easy Tables.
users.js
var table = module.exports = require('azure-mobile-apps').table();
// table.read(function (context) {
// return context.execute();
// });
// table.read.use(customMiddleware, table.operation);
users.json
{
    "softDelete": true,
    "autoIncrement": false,
    "insert": {
        "access": "anonymous"
    },
    "update": {
        "access": "anonymous"
    },
    "delete": {
        "access": "anonymous"
    },
    "read": {
        "access": "anonymous"
    },
    "undelete": {
        "access": "anonymous"
    }
}
The users table is showing up in Easy Tables but there is no data showing, despite the data displaying in SQL Server Management Studio 2017. Have I missed a step? What am I doing wrong?
In my test, I was also unable to see the data in the Azure portal after following the same steps as yours. I think the portal needs time to load the data, since the table might have many records.
Since you set the Read permission to allow anonymous access, I would recommend accessing the table via the REST API:
https://[YourAppName].azurewebsites.net/tables/users?ZUMO-API-VERSION=2.0.0
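A quick way to check whether the backend actually returns rows is to hit that endpoint from the command line. A minimal sketch, where "yourappname" is a placeholder for your App Service name:

```shell
# Build the Easy Tables REST URL; "yourappname" is a placeholder.
APP="yourappname"
URL="https://${APP}.azurewebsites.net/tables/users?ZUMO-API-VERSION=2.0.0"
echo "$URL"
# Query the table anonymously (requires the app to be running):
# curl -s "$URL"
# The API version can also be sent as a header instead of a query parameter:
# curl -s -H "ZUMO-API-VERSION: 2.0.0" "https://${APP}.azurewebsites.net/tables/users"
```

If this returns a JSON array with your rows, the data is reaching the backend and the portal view is simply slow to load.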
I have an Azure Cognitive Search service and am trying to connect it to an Azure SQL table.
The Azure SQL server has public access disabled and a Private Link has been created.
I have created an Azure Cognitive Search service and a Private Link for it as well.
The Search service has Managed Identity enabled, and the Managed Identity has been given access to the SQL database and server.
Now when I try to create a data source the error being returned is
Failed to create data source "search-ds", error: "Failed to fetch"
What am I doing wrong?
Thanks in advance.
BR
This documentation topic gives the most comprehensive tutorial on using Private Link with Azure Cognitive Search indexers: https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private?tabs=portal-create%2Cportal-status
It has a troubleshooting section near the end.
One of the things it mentions that seems easy to miss is the configuration of the indexer to work in private mode:
{
    "name": "indexer",
    "dataSourceName": "your-datasource",
    "targetIndexName": "index",
    "parameters": {
        "configuration": {
            "executionEnvironment": "private"
        }
    },
    "fieldMappings": []
}
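That setting can be applied with a PUT against the Search REST API. A minimal sketch, where the service name, indexer name, and api-version are placeholders/assumptions, not values from the question:

```shell
# Build the indexer update URL; service and indexer names are placeholders.
SERVICE="your-search-service"
INDEXER="indexer"
URL="https://${SERVICE}.search.windows.net/indexers/${INDEXER}?api-version=2023-11-01"
echo "$URL"
# Apply the definition (indexer.json holding the JSON above, including
# "executionEnvironment": "private"); requires an admin api-key:
# curl -s -X PUT "$URL" \
#   -H "Content-Type: application/json" \
#   -H "api-key: <your-admin-key>" \
#   -d @indexer.json
```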
Besides the Private Link aspect, the regular advice also applies, and a lot of things can prevent the indexer from connecting to the data source. Here's another topic that covers other possible causes: https://learn.microsoft.com/en-us/azure/search/search-indexer-troubleshooting
I am using Serenity with PostgreSQL and I have generated a new project using Visual Studio 2019.
I have followed the tutorial on how to connect the app to PostgreSQL.
I have created a new database and user in PgAdmin.
I have enabled the app to run migrations when I run my app.
Here is a sample of my connection string.
"Data": {
"Default": {
"ConnectionString": "Server=localhost; Port=5432; User Id=kap_dev; Database=kap_db; Password=kapap_password;",
"ProviderName": "Npgsql"
}
However, I get the error
PostgresException: 3D000: database "kap_dev" does not exist.
The issue is that kap_dev is a user and not a database.
I even posted this error on their GitHub issues, but Serenity has not responded with a valid answer.
It might be a typo: no space between User and Id.
You have to create a database whose name matches the name of your user, "kap_dev".
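A minimal sketch of that fix from the command line, assuming a local server and a superuser named postgres (adjust -U/-h for your setup):

```shell
# The statement the error implies is missing; kap_dev is the user name
# from the question, reused here as the database name.
SQL="CREATE DATABASE kap_dev OWNER kap_dev;"
echo "$SQL"
# Run it against the server (uncomment with a reachable PostgreSQL instance):
# psql -U postgres -c "$SQL"
# or equivalently:
# createdb -U postgres -O kap_dev kap_dev
```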
I have two SQL Server instances, one being an on-premises SQL Server and the other an Azure SQL Database instance. Some of the tables in the Azure SQL database have a few columns which contain data from the on-premises SQL Server database (although the table schemas are different).
We need to make sure that whenever new entries are added to the on-premises SQL Server database, the corresponding entries get inserted into the Azure SQL database as well.
What is the best way to do this?
You can write your own code using the Sync Framework to specify which tables you want to sync.
using (SqlConnection sqlServerConn = new SqlConnection(LocalSQLServerConnectionString))
{
    using (SqlConnection sqlAzureConn = new SqlConnection(RemoteSQLAzureConnectionString))
    {
        DbSyncScopeDescription myScope = new DbSyncScopeDescription(scopeName);
        DbSyncTableDescription Customer =
            SqlSyncDescriptionBuilder.GetDescriptionForTable("SalesLT.Customer", sqlServerConn);
        DbSyncTableDescription Product =
            SqlSyncDescriptionBuilder.GetDescriptionForTable("SalesLT.Product", sqlServerConn);

        // Add the tables from above to the scope
        myScope.Tables.Add(Customer);
        myScope.Tables.Add(Product);
The next section of code sets up the local on-premises SQL Server for provisioning. If the SQL Server already contains the table schemas and data, then what does it have to do? The Sync Framework uses both databases as data storage for configuration information and state information about the current status of the synchronization, so provisioning creates tables on your local SQL Server to store this information.
        // Set up SQL Server for sync
        SqlSyncScopeProvisioning sqlServerProv =
            new SqlSyncScopeProvisioning(sqlServerConn, myScope);
        if (!sqlServerProv.ScopeExists(scopeName))
            // Apply the scope provisioning.
            sqlServerProv.Apply();
The next section of code does the same thing for the remote SQL Database server. However, it also creates the schemas and data tables that it is going to synchronize, based on the local SQL Server scope. Here is what the code looks like:
        // Set up SQL Database for sync
        SqlSyncScopeProvisioning sqlAzureProv =
            new SqlSyncScopeProvisioning(sqlAzureConn, myScope);
        if (!sqlAzureProv.ScopeExists(scopeName))
            // Apply the scope provisioning.
            sqlAzureProv.Apply();
    }
}
To set up the databases, just run the console application like this:
SyncConsole.exe -setup
Database setup only needs to happen once; however, you might want to synchronize the databases multiple times. Because of this, the code is split into two sections, one for setup and one for synchronization.
The code synchronizing the data is just as simple. Here is what it looks like:
using (SqlConnection sqlServerConn = new SqlConnection(LocalSQLServerConnectionString))
{
    using (SqlConnection sqlAzureConn = new SqlConnection(RemoteSQLAzureConnectionString))
    {
        SyncOrchestrator syncOrchestrator = new SyncOrchestrator
        {
            LocalProvider = new SqlSyncProvider(scopeName, sqlAzureConn),
            RemoteProvider = new SqlSyncProvider(scopeName, sqlServerConn),
            Direction = SyncDirectionOrder.UploadAndDownload
        };
        syncOrchestrator.Synchronize();
    }
}
In the synchronization code we create two connections and instantiate a sync orchestrator, telling it that we want to upload and download the data. This is bi-directional synchronization: writes in either SQL Database or SQL Server are moved to the other.
To synchronize the databases just run the console application like this:
SyncConsole.exe -sync
Once the synchronization has completed, we can query the SQL Database and see that the data is there.
To see a full example of how to do it, please visit this article.
I have an existing app built with Appcelerator's Titanium. It currently has a database on it, but I want to upgrade it to an encrypted database using Appcelerator's encrypted database module. If I delete the local database and build the app clean, it is fine; the problem is when I try to migrate from the existing database to the encrypted one. I get this error:
file is encrypted or is not a database
message = "Couldn't open database and migrate";
Script Error Module "alloy/models/table.js" failed to leave a valid exports object
My exports definition looks like this:
'config': {
    columns: columns,
    defaults: {},
    adapter: {
        "type": "enc.db",
        "collection_name": "table",
        "idAttribute": "LocalID",
        "db_name": "_alloy_.enc.db"
    }
},
I am using MVC4 with the default membership provider and Entity Framework code first. The membership data is in a database stored in my App_Data folder called aspnet-xxxxxx.mdf, and my application data is created in a separate database running on SQL Server Express.
What do I have to do so that I have one database containing both my membership tables and the application data? I do not mind losing the data in either of these databases, as at the moment it is only test data.
Find your InitializeSimpleMembershipAttribute.cs and, in the line
WebSecurity.InitializeDatabaseConnection("[ConnectionString]", builder.Provider, "UserProfile", "UserId", "UserName", autoCreateTables: false);
change [ConnectionString] to the name of a connection string you defined in Web.config, or provide a complete connection string.
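For completeness, a sketch of what that Web.config entry might look like; the name "DefaultConnection" and the database name "MyAppDb" are assumptions, not values from the question:

```xml
<!-- A single connection string that both SimpleMembership and the EF
     code-first context can share, so everything lands in one database.
     "DefaultConnection" and "MyAppDb" are placeholder names. -->
<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MyAppDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

Pass "DefaultConnection" both to WebSecurity.InitializeDatabaseConnection and to your DbContext constructor so the membership tables and the application tables are created in the same database.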