Fill array using ODBC adapter without loop - arrays

Please help me load array values from a database using an ODBC adapter without using a loop statement.
Thanks,
S.Somu

Since you didn't specify a language, here's an example in Python using mxODBC:
# fetchall() returns the entire result set at once, so no explicit loop is needed
import mx.ODBC.unixODBC as mx
db = mx.DriverConnect('DSN=[your DSN here]')
c = db.cursor()
c.execute("SELECT * FROM tableName")
results = c.fetchall()
If you can provide more detail, such as a specific language, you might get an answer closer to what you want.
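For what it's worth, the same no-loop fetch works with the more widely used pyodbc driver as well; here is a minimal sketch, assuming a DSN named your_dsn and a table named tableName (both placeholders):
import pyodbc

conn = pyodbc.connect('DSN=your_dsn')       # hypothetical DSN name
cursor = conn.cursor()
cursor.execute("SELECT * FROM tableName")
rows = cursor.fetchall()                     # whole result set as a list of rows, no explicit loop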

Related

Lua: Connect with MSSQL DB

I'm learning Lua (version 5.3). Is there a way to connect to an MSSQL database?
I try to do it as follows
print("Connection to the db:")
local dbuser = '....'
local dbpass = '...'
local dbsource = '....'
local dbname = 'xx.xx.xxx.xx'
database.open("Provider= ODBC ; Initial Catalog = dbname; Data Source = dbsource ; User ID = dbuser ; Password = dbpass ")
print("Database opened succesfully.")
print("Connection to the db:")
But it didn't work; execution returns:
attempt to index a nil value (global 'database')
Thanks,
I try to do it as follows
database.open("Provider= ODBC ; Initial Catalog = dbname; Data Source = dbsource ; User ID = dbuser ; Password = dbpass ")
What made you think you can do that? You get that error because there is no global variable database unless you define one.
Hence you may not index it. Trial and error with random code is not a very good way to get things done in programming.
Also you cannot handle strings like that in Lua. You either need to use string.format or the concatenation operator .. to get your variables into that string.
There's also no printf in Lua unless you define it.
I suggest you learn the basics of Lua before you get into interfacing databases.
From reading the Lua reference manual it should become obvious that Lua does not know anything about databases.
Either you run Lua embedded into some host application that provides database access through its Lua API or you need to load a library that supports that.
Searching the web for "lua sql" instantly yields
https://keplerproject.github.io/luasql/
and others.
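For example, once LuaSQL's ODBC driver is installed (e.g. via luarocks install luasql-odbc) and an ODBC DSN pointing at the MSSQL server is configured, a connection might look roughly like this; the DSN name, credentials and query below are placeholders:
-- minimal sketch using LuaSQL's ODBC back end (DSN, user and password are placeholders)
local luasql = require "luasql.odbc"
local env = luasql.odbc()
local con = assert(env:connect("MyMssqlDSN", "dbuser", "dbpass"))
local cur = assert(con:execute("SELECT TOP 1 name FROM sys.databases"))
local row = cur:fetch({}, "a")   -- fetch one row as a table keyed by column name
print(row and row.name)
cur:close(); con:close(); env:close()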

PostGIS invalid GML representation

What is wrong with my geometry? The geometry is here.
When I run this command:
SELECT st_geomfromgml('content of file geometry.xml');
this error was thrown:
ERROR: invalid GML representation
I'm using postgres 9.4.1 and PostGIS 2.1.6
Thank you for your help.
The documentation for ST_GeomFromGML says that the function does "not support SQL/MM curves geometries". It's unfortunate that the error message does not articulate this shortcoming, as the GML has a valid representation to modern software. There is an enhancement ticket to enable this support, but there has not been any movement on this after several years.
As a workaround, you can use GDAL/OGR from (e.g.) Python to read the GML and export the WKB to PostGIS:
from osgeo import ogr

# Read the GML and build an OGR geometry
with open('geometry.xml', 'r') as fp:
    g = ogr.CreateGeometryFromGML(fp.read())

g.GetArea()  # 4519550457.106098

# There is more than one way to insert this into PostGIS; here is one way
import psycopg2
conn = psycopg2.connect('dbname=postgis host=localhost user=postgres port=5432')
curs = conn.cursor()
curs.execute('CREATE TABLE surf(geom geometry);')
# hex-encode the WKB (Python 2 style string encoding)
curs.execute('INSERT INTO surf(geom) VALUES (%s)', (g.ExportToWkb().encode('hex'),))
conn.commit()

curs.execute('SELECT ST_Area(geom) FROM surf')
curs.fetchone()[0]  # 4519550457.07643
The two area calculations (using different methods) are essentially the same, which is reassuring.

Avoid SQL Injection when using Dynamic SQL Code

I am working on the security remediation of an existing Java web application. The application has some dynamic SQL code executed through JDBC, but this is not accepted by the static code analysis (SCA) tool we use, so I am looking for a way to remediate the issue. Basically, I have validated all the input passed to the code which constructs the query, so there is no possibility of SQL injection. But the SCA tool still does not approve of this validation. So I want to know if there is any way I can avoid the dynamic query logic. Prepared statements cannot be used as-is, because the query is dynamically constructed based on conditions.
I know stored procedures can help, but I understand they have their own issues and the team is not experienced with them, so I am looking for a better way to address this. Also, since we are using SQL Server, I couldn't find a suitable encoding function in the ESAPI toolkit to sanitize the query parameters; it supports Oracle and MySQL only.
I want to know whether using a framework like MyBatis to offload the SQL-constructing Java code to XML files would resolve the issue. Please let me know if there is any other, better way.
You can generate SQL dynamically and still use prepared statements. Here is the idea of how this can be done.
Now you have code like this:
StringBuilder whereClause = new StringBuilder();
if (name != null) {
    whereClause.append(String.format("name = '%s'", name));
}
// other similar conditions
String sql = "select * from table"
        + (whereClause.length() != 0 ? " where " + whereClause : "");
Statement stmt = connection.createStatement();
ResultSet rs = stmt.executeQuery(sql);
// use rs to fetch data
And you need to change this to something like
StringBuilder whereClause = new StringBuilder();
ArrayList<Object> parameters = new ArrayList<>();
if (name != null) {
    whereClause.append("name = ?");
    parameters.add(name);
}
// other similar conditions
String sql = "select * from table"
        + (whereClause.length() != 0 ? " where " + whereClause : "");
PreparedStatement stmt = connection.prepareStatement(sql);
for (int i = 0; i < parameters.size(); ++i) {
    setParameterValue(stmt, i + 1, parameters.get(i));
}
ResultSet rs = stmt.executeQuery();
// use rs to fetch data
setParameterValue should look like this:
void setParameterValue(PreparedStatement ps, int index, Object value) throws SQLException {
    if (value instanceof String) {
        ps.setString(index, (String) value);
    } else if (value instanceof Integer) {
        ps.setInt(index, (Integer) value);
    } // and more boilerplate code like this for all types you need
}
With MyBatis you can avoid writing such boilerplate code to generate dynamic SQL, which makes this much easier. But I don't know how the SCA tool treats MyBatis-generated SQL.
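For reference, a minimal sketch of what the MyBatis route might look like, using an annotation-based mapper with the <script> syntax (the mapper and method names are made up; an XML mapper would use the same <where>/<if> elements):
// Hypothetical MyBatis mapper interface; #{name} is bound as a prepared-statement parameter,
// and <where>/<if> add the WHERE clause only when the argument is present.
import java.util.List;
import java.util.Map;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface AccountMapper {
    @Select("<script>"
          + "select * from table"
          + "<where>"
          + "  <if test='name != null'>name = #{name}</if>"
          + "</where>"
          + "</script>")
    List<Map<String, Object>> findByOptionalName(@Param("name") String name);
}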
I've found this question while trying to solve a similar problem myself.
First, we may factor the SQL code out of the Java files and store it in text files under the resources folder. Then, from Java code, use a classloader method to read the SQL as an InputStream and convert it to a String. Storing SQL code in separate files will enable static code analysis.
Second, we can use named parameters in the SQL, in some form that is easily recognizable via regular expressions, e.g. the ${namedParam} syntax familiar from various expression languages. Then we can write a helper method that takes this parametrised SQL and a Map<String, Object> of query parameters, where the keys correspond to the SQL parameter names. This helper method would produce a PreparedStatement with the parameters set. Using named parameters makes the SQL more readable and will save us some debugging.
Third, we can use SQL comments to mark parts of the SQL code as dependent on the presence of some parameter, and have the previously described helper method include in the resulting statement only the parts for which entries in the parameters Map exist, e.g.: /*${namedParam}[*/ some sql code /*]${namedParam}*/. This is an unobtrusive way to insert conditions into our dynamic SQL.
Following the DRY principle, we could also try to employ some existing expression-language engine, but that would add one more dependency and some processing expense.
I will post the solution here once I get working code.
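In the meantime, here is a minimal, untested sketch of the named-parameter part of the idea (the class and method names are made up, and the comment-based conditional sections are not handled here):
// Rough sketch: replace ${param} placeholders with '?' and bind the values from the map.
import java.sql.*;
import java.util.*;
import java.util.regex.*;

class SqlTemplate {
    private static final Pattern PARAM = Pattern.compile("\\$\\{(\\w+)\\}");

    static PreparedStatement prepare(Connection con, String sqlTemplate,
                                     Map<String, Object> params) throws SQLException {
        List<Object> values = new ArrayList<>();
        Matcher m = PARAM.matcher(sqlTemplate);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            values.add(params.get(m.group(1)));  // remember values in bind order
            m.appendReplacement(sb, "?");        // each placeholder becomes a JDBC parameter
        }
        m.appendTail(sb);
        PreparedStatement ps = con.prepareStatement(sb.toString());
        for (int i = 0; i < values.size(); i++) {
            ps.setObject(i + 1, values.get(i));  // setObject avoids per-type boilerplate
        }
        return ps;
    }
}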

Can I access sqlite3 using octave?

Is there a way to read and write to sqlite3 from octave?
I'm thinking of something along the lines of RODBC in R or the sqlite3 module in Python, but for Octave.
I looked on Octave-Forge (http://octave.sourceforge.net/packages.php)
but could only find the 'database' package, which only supports PostgreSQL.
Details:
OS: Ubuntu 12.04
Octave: 3.6.2
sqlite: 3.7.9
I realise this is an old question, but most answers here seem to miss the point, focusing on whether there exists a bespoke octave package providing a formal interface, rather than whether it is possible to perform sqlite3 queries from within octave at all in the first place.
Therefore I thought I'd provide a practical answer for anyone simply trying to access sqlite3 via octave; it is in fact trivial to do so, I have done so myself many times.
Simply do an appropriate system call to the sqlite3 command (obviously this implies you have an sqlite3 client installed on your system). I find the most convenient way to do so is to use the
sqlite3 database.sqlite < FileContainingQuery > OutputToFile
syntax for calling sqlite3.
Any sqlite3 commands modifying output can be passed together with the query to obtain the output in the desired format.
E.g. here's a toy example plotting a frequency chart from a table which returns appropriate scores and counts in csv format (with headers and runtime stats stripped from the output).
pkg load io % required for csv2cell (used to collect results)
% Define database and Query
Database = '/absolute/path/to/database.sqlite';
Query = strcat(
    % Options to sqlite3 modifying output format:
    ".timer off \n",   % Prevents runtime stats printed at end of query
    ".headers off \n", % If you just want the output without headers
    ".mode csv \n",    % Export as csv; use csv2cell to collect results
    % actual query
    "SELECT Scores, Counts \n",
    "FROM Data; \n"    % (Don't forget the semicolon!)
);
% Create temporary files to hold query and results
QueryFile = tempname() ; QueryFId = fopen( QueryFile, 'w' );
fprintf( QueryFId, Query ); fclose( QueryFId);
ResultsFile = tempname();
% Run query
Cmd = sprintf( 'sqlite3 "%s" < "%s" > "%s"', Database, QueryFile, ResultsFile );
[Status, Output] = system( Cmd );
% Confirm query succeeded and if so collect Results
% in a cell array and clean up temp files.
if Status != 0, delete( QueryFile, ResultsFile ); error("Query Failed");
else, Results = csv2cell( ResultsFile ); delete( QueryFile, ResultsFile );
end
% Process Results
Results = cell2mat( Results );
Scores = Results(:, 1); Counts = Results(:, 2);
BarChart = bar( Scores, Counts, 0.7 ); % ... etc
Et voilà!
According to Octave-Forge the answer is no.
Interface to SQL databases, currently only postgresql using libpq.
But you could write your own database package using the Octave C++ API together with the SQLite C API.
As you already found out, the new version of the database package (2.0.0) only supports PostgreSQL. However, old versions of the package also supported MySQL and SQLite (the last version with them was 1.0.4).
The problem is that the old database packages do not work with the new Octave and SWIG versions (I think the last version of Octave where the database package worked was 3.2.4). Aside from the lack of a maintainer (the package was abandoned for almost 4 years), its use of SWIG was becoming a problem, since it made it more difficult for other developers to step in. Still, some users tried to fix it, and some partial fixes were made (but never released). See bug #38098 and Octave's wiki page on the database package for some reports on making it work with SQLite in Octave 3.6.2.
The new version of the package is a complete restart. It would be great if you could contribute to the development of SQLite bindings.
Check out this link, http://octave.1599824.n4.nabble.com/Octave-and-databases-td2402806.html, which asks the same question regarding MySQL.
In particular, this reply from Martin Helm points the way to using JDBC to connect to any JDBC-supported database:
"Look at the java bindings in the octave java package (octave-forge), it is
maintained and it works. Java is very strong and easy for database handling.
Use that and the jdbc driver for mysql to connect to mysql (or, with the
appropriate jdbc driver, everything else which you can imagine). That is what I
do when using db queries from octave. Much easier and less indirect than
invoking scripts and parsing output from database queries.
As far as I remember the database package is somehow broken (at least I never
was able to use it)."
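As a rough illustration of that approach (assuming a reasonably recent Octave with the built-in Java interface and a MySQL JDBC driver jar on disk; the jar path, host, credentials and table are placeholders), it might look something like this:
% Hypothetical sketch: query MySQL from Octave through JDBC
javaaddpath("/path/to/mysql-connector-java.jar");   % JDBC driver jar (placeholder path)
javaMethod("forName", "java.lang.Class", "com.mysql.jdbc.Driver");
conn = javaMethod("getConnection", "java.sql.DriverManager", ...
                  "jdbc:mysql://localhost:3306/testdb", "user", "password");
stmt = conn.createStatement();
rs = stmt.executeQuery("SELECT id, name FROM sometable");
while rs.next()
  id = rs.getInt("id");
  name = rs.getString("name");
  printf("%d: %s\n", id, name);
endwhile
rs.close(); stmt.close(); conn.close();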
I know this thread is pretty old, but for anybody else out there looking for a similar solution, this project seems to provide it.
https://github.com/markuman/go-sqlite

Import data from Excel to SQL server with substring function in query

I am pretty much following the example on this MSDN page:
http://code.msdn.microsoft.com/Imoprt-Data-from-Excel-to-705ecfcd
with 1 exception. Instead of a simple query like
"Select * FROM [Sheet1$]"
I'd like to do something a little more complicated like:
"Select *, SUBSTRING(COLUMN_A, 1, 5) as STRIPPED_COL_A FROM [Sheet1$]"
but I'm getting a useless Exception message "IErrorInfo.GetDescription failed with E_FAIL(0x80004005)."
If I had to guess, the problem is due to the use of the unsupported SUBSTRING function in Excel/OleDB. So how do I get around the problem?
In that example you are using the Microsoft.ACE.OLEDB driver to load the Excel sheet via Provider=Microsoft.ACE.OLEDB.12.0 or Provider=Microsoft.Jet.OLEDB.4.0, so you are going to have to use the query formatting and functionality available within MS Access.
Select *, MID(Column_A, <<start>>, <<length>>) as STRIPPED_COL_A from [Sheet1$]
Keep in mind other peculiarities, such as <<start>> not being zero-indexed; instead, 1 refers to the first character in the string.
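For the query in the question, that would presumably become something like:
Select *, MID(COLUMN_A, 1, 5) as STRIPPED_COL_A FROM [Sheet1$]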
