I have more than 2,000 shapefiles stored in a PostGIS database, which is also registered as a store in my GeoServer. By connecting to the store I can see all the layers in the layer tab and publish individual layers one by one, but I want to publish all the layers from this store at once. Is it possible to publish all the layers at the same time? Is there a method for this?
The GeoServer importer extension was designed to support mass layer configurations from a single store, check it out: https://docs.geoserver.org/latest/en/user/extensions/importer/index.html
Check out this tutorial; it explains the procedure using cURL:
http://benardonyango.com/2019/11/19/publishing-multiple-layers-of-geoserver-using-script/
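If you prefer to script it yourself rather than use the importer UI, the loop the cURL tutorial describes can also be written against the GeoServer REST API. Below is a minimal Python sketch, assuming the requests library; the workspace/store names and credentials are placeholders, and the exact shape of the list=available JSON response may vary between GeoServer versions.

import requests

GEOSERVER = "http://localhost:8080/geoserver/rest"
AUTH = ("admin", "geoserver")                      # placeholder credentials
WS, STORE = "my_workspace", "my_postgis_store"     # placeholder names

ft_url = f"{GEOSERVER}/workspaces/{WS}/datastores/{STORE}/featuretypes"

# 'list=available' asks for tables in the store that are not yet published
payload = requests.get(f"{ft_url}.json?list=available", auth=AUTH).json()
names = payload.get("list") or {}
names = names.get("string", []) if isinstance(names, dict) else []
if isinstance(names, str):                         # a single result may come back as a plain string
    names = [names]

for name in names:
    resp = requests.post(
        ft_url,
        data=f"<featureType><name>{name}</name></featureType>",
        headers={"Content-Type": "text/xml"},
        auth=AUTH,
    )
    print(name, resp.status_code)                  # 201 means the layer was published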
I would like to know if this scenario would be possible in any programming language combined with any database technology.
I would like to automatically save received pdf files that are attached in emails into a database. Is this possible? Is there any library or framework available to do so?
Yes, I would recommend using Google Apps Script for this. The approach to follow is to use the GmailApp class (documentation here) to get the messages you need; methods like getInboxThreads() (documentation) will retrieve them.
After you've found the message and retrieved the attachment (which you can do with getAttachments() (documentation)), you can use the JDBC Service to connect to an external database. The specifics depend a lot on which database you want to connect to, but the documentation will lead you in the right direction.
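If you are not tied to Gmail and Apps Script, the same idea can be sketched in plain Python using IMAP plus a database driver instead; the mail server, credentials and table schema below are placeholders, and SQLite only stands in for whatever database you actually use.

import imaplib, email, sqlite3

conn = sqlite3.connect("attachments.db")           # stand-in for your real database
conn.execute("CREATE TABLE IF NOT EXISTS pdfs (filename TEXT, content BLOB)")

mail = imaplib.IMAP4_SSL("imap.example.com")       # placeholder mail server
mail.login("user@example.com", "password")
mail.select("INBOX")

_, data = mail.search(None, "ALL")                 # or '(UNSEEN)' to process only new mail
for num in data[0].split():
    _, msg_data = mail.fetch(num, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    for part in msg.walk():
        if part.get_content_type() == "application/pdf":
            conn.execute(
                "INSERT INTO pdfs (filename, content) VALUES (?, ?)",
                (part.get_filename(), part.get_payload(decode=True)),
            )
conn.commit()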
I have a spatial database in SQL Server and I am working in ASP.NET MVC.
My project must show the stored shapes on a web page that uses the OpenLayers JS library; the user must be able to view and edit the shapes and send the result back to the server.
I want to know how I can send data to my map, receive new data from it, and save that data to the database.
Is there any library that would work for me?
It does not have to be open source.
Thanks a lot.
There are (at least) two solutions available to you:
either you use OpenLayers' "Save" strategy, attached to your WFS vector layer: you draw features and it saves them for you with the help of your map server;
or you save your features yourself (which I prefer). Just serialize your VectorLayer.features feature collection, translate the geometries using the OpenLayers.Format.WKT parser to get the WKT of each geometry, and then use simple INSERT/UPDATE statements to put the WKT into your database, converting it back to geometry with the help of a spatial function. (You didn't mention which flavour of SQL server you use, so here is an MS SQL example string in C#.)
string insertString = "INSERT INTO myTable(geometry) VALUES(geometry::STGeomFromText('" + objFeature.WKTString + "', 5514))";
As far as I know, SharePoint saves all user lists in one table. I have several SharePoint lists, and I want to store the data from these lists in a custom MS SQL Server database, with different SharePoint lists stored in different tables. The data should live only in my custom DB (not in the SharePoint DB).
I also want mutual (many-to-many) links between the different lists in this DB. For example, I have two lists, Projects and Employers: one project can have many employers and one employer can work on several projects. If I delete an employer from a project, the link to that project should also be removed from that employer.
Could you recommend some solutions for this task?
I think I know what you're trying to do :\
You might want to look at this http://www.simego.com/Products/Data-Synchronisation-Studio and use dynamic columns
Sounds like a real mashup. I'd be using some external components like the ASPxGridView from DevExpress, http://www.devexpress.com/Products/NET/Controls/ASP/Grid/, for the list views, since you won't be able to use the internal lists.
To interface with the internal SharePoint lists I'd use the Camelot .NET Connector from Bendsoft, http://www.bendsoft.com/net-sharepoint-connector/.
With that combination it won't really matter where you put the result; it can be used internally in SharePoint as well as externally, and it doesn't matter whether you use 2007 or a newer version either.
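Whichever synchronisation tool you pick, the many-to-many link between Projects and Employers is usually modelled in the custom database as a junction table, so removing a link (or deleting an employer altogether, thanks to the cascades) keeps both sides consistent. A minimal sketch, with hypothetical table, column and connection-string values, using pyodbc:

import pyodbc

# Placeholder connection string; point it at your custom database
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;DATABASE=MyCustomDb;Trusted_Connection=yes"
)

# Junction table linking Projects and Employers; deleting a project or an
# employer automatically removes the corresponding link rows
conn.execute("""
    CREATE TABLE ProjectEmployer (
        ProjectId  INT NOT NULL REFERENCES Projects(Id)  ON DELETE CASCADE,
        EmployerId INT NOT NULL REFERENCES Employers(Id) ON DELETE CASCADE,
        PRIMARY KEY (ProjectId, EmployerId)
    )
""")
conn.commit()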
I have a Django project with multiple apps. They all share a DB with engine = django.db.backends.postgresql_psycopg2. Now I want some of GeoDjango's functionality and decided to integrate it into my existing project. I read through the tutorial, and it looks like I have to create a separate spatial database for GeoDjango. I wonder if there is any way around that. I tried adding this to one of my apps' models.py without changing my DB settings:
from django.db import models
from django.contrib.gis.db.models import PointField

class Location(models.Model):
    location = PointField()
But when I run syncdb, I get this error:
File "/home/virtual/virtual-env/lib/python2.7/site-packages/django/contrib/gis/db/models/fields.py", line 200, in db_type
return connection.ops.geo_db_type(self)
Actually, as I recall, django.contrib.gis.db.backends.postgis is an extension of postgresql_psycopg2, so you could change the DB driver in settings, create a new DB from the spatial template and then migrate the data to the new DB (South is great for this). GeoDjango itself is highly dependent on the database's internal functions, so unfortunately you can't use it with a regular (non-spatial) DB.
The other way: you could make use of Django's multi-db ability and create an extra database for the GeoDjango models, as sketched below.
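A minimal sketch of that multi-db setup, assuming a hypothetical geoapp application holding the GeoDjango models (on very old Django versions the router hook is allow_syncdb rather than allow_migrate):

# settings.py: keep the existing default DB and add a PostGIS-enabled one
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "main_db",          # placeholder name
    },
    "geo": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": "geo_db",           # placeholder name
    },
}
DATABASE_ROUTERS = ["myproject.routers.GeoRouter"]

# myproject/routers.py: send everything in the geo app to the spatial database
class GeoRouter:
    def db_for_read(self, model, **hints):
        return "geo" if model._meta.app_label == "geoapp" else None

    def db_for_write(self, model, **hints):
        return "geo" if model._meta.app_label == "geoapp" else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == "geoapp":
            return db == "geo"
        return None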
Your error looks like it comes from not changing the database engine in your settings file. You don't technically need to create a new database from the spatial template; you can simply run the PostGIS scripts on your existing database to get all of the geospatial goodies. As always, you should back up your existing database before doing this, though.
I'm not 100% sure, but I think you can pipe postgis.sql and spatial_ref_sys.sql into your existing database, grant permissions on the tables, and change the DB setting to "django.contrib.gis.db.backends.postgis". (After you have installed the dependencies, of course.)
https://docs.djangoproject.com/en/dev/ref/contrib/gis/install/#spatialdb-template
I'd be interested to see what you find. Be careful, postgis installation can build some character but you don't want it to build too much.
From the docs (django 3.1) https://docs.djangoproject.com/en/3.1/ref/databases/#migration-operation-for-adding-extensions :
If you need to add a PostgreSQL extension (like hstore, postgis, etc.) using a migration, use the CreateExtension operation.
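A migration of that kind is short; a minimal sketch, with a hypothetical app name and migration dependency:

from django.contrib.postgres.operations import CreateExtension
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ("myapp", "0001_initial"),   # hypothetical previous migration
    ]

    operations = [
        CreateExtension("postgis"),  # runs CREATE EXTENSION IF NOT EXISTS postgis
    ]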
I want to create an application in ActionScript 3.0 that allows the user to listen to music and read descriptions of it. For this to happen I suppose there should be a database where the text bits and music are located, and Flash then fetches the info when the right buttons are pushed. The database will contain up to 100 tracks and text bits.
The application will function on a stand that won't have a connection to the internet.
What is the easiest way to do this in actionscript 3.0?
If any of you are familiar with UML and think it might help in understanding the problem, here are the flow chart and use case:
Flow chart: http://img135.imageshack.us/img135/1498/flowchart2.jpg
Use case: http://img27.imageshack.us/img27/1000/usercase.jpg
Thanks in advance.
The easiest way to do what you're asking is probably to store the files in a directory on the machine the application is going to run on, and then design an XML structure for storing your data. The XML is easily loaded into Flash at runtime and is easily editable.
Your other option would be running a database server on the machine, creating web services that run locally and push/pull the data from the database, and then call those services from your Flash application.
The first option is most definitely the easiest and should be able to provide exactly what you need. The second would be more geared towards a distributed Flash application where you needed a central data repository for the clients.
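If you go the XML route, one possible layout (the file name, fields and paths are just an illustration) could be:

<tracks>
    <track id="1">
        <title>Track title</title>
        <file>audio/track01.mp3</file>
        <description>Short text shown while the track plays.</description>
    </track>
    <!-- ... up to 100 track entries ... -->
</tracks>

Flash can load a file like this at runtime (e.g. with URLLoader), and updating the content on the stand is just a matter of editing the file.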
If you're building an AIR application, you can use the integrated SQLite database. But I agree with Justin: the easiest way is to use an XML file.
You could also consider using "Local Shared Objects", which are a kind of cookie with a bigger capacity (100 KB by default, but you can change it). Compared to the other solutions already proposed, this has the advantage of not requiring any web server.