Large Number Entry in Oracle APEX

I have a NUMBER field in the DB and a corresponding item in Oracle APEX.
My issue is:
if users enter the number in the format "1.000.000,01", they get a character error saying the entry must be a number.
How can I solve this problem in the application layer? There are some solutions in the database layer, but so far I cannot find any solution in the application layer.
In summary: I want to enter a number as 1.000.000,12 in the application and see it displayed in the same format.
Note: a procedure runs in the application to insert the data into the DB.

You can/should set an appropriate format mask, e.g. 999G999G990D00, where
G represents the group (thousands) separator (a dot in your case)
D represents the decimal separator (a comma in your case)
But where do you set the NLS numeric characters (represented by G and D)? In APEX 20.2, they are set in:
application builder
shared components
globalization attributes
security
initialization PL/SQL code - in here, you'll probably see what they are currently set to. Change those values, if necessary. For example:
begin
execute immediate q'[alter session set nls_numeric_characters = ',.']';
execute immediate q'[alter session set nls_date_format = 'dd.mm.yyyy hh24:mi:ss']';
end;
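To see the effect of the mask and the NLS setting together, here is a minimal sketch you could run in SQL Workshop (the values are illustrative):
alter session set nls_numeric_characters = ',.';
-- display: formats 1000000.12 as 1.000.000,12
select to_char(1000000.12, '999G999G990D00') from dual;
-- input: parses the user's string back into a number
select to_number('1.000.000,12', '999G999G990D00') from dual;
With the format mask set on the page item and matching NLS characters, APEX performs the same conversion for the item automatically.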

Related

Delphi stream to/from database with FireDAC

Originally I want to save/retrieve a report from FastReports, which uses SaveToStream/LoadToStream for this purpose. I use RAD Studio XE6 (Update 1).
In the database I have a table Reports with an index field 'StVn' of type int and a field 'Definition' of type ntext. The database is MSSQL, and for saving the report I use:
FDCommand.CommandText.Text:='UPDATE Reports SET Definition= :pDefinition WHERE StVn=1';
FDCommand.Params.ParamByName('pDefinition').LoadFromStream(MyStream, ftWidememo);
FDCommand.Execute;
and for retrieving:
FDQuery.SQL.Text:='SELECT * FROM Reports WHERE StVn=1';
FDQuery.Open();
MyStream:=FDQuery.CreateBlobStream(FDQuery.FieldByName('Definition'),bmRead);
This worked for some short reports, but for any real one, saving/restoring corrupts the report definition.
So I made a test case on a new form with just a Memo and tried to save/restore it with the same data access setup (FDConnection, FDCommand, FDQuery) and the following code:
procedure TForm1.BMemoSaveClick(Sender: TObject);
var
  TmpStream: TStream;
begin
  TmpStream := TMemoryStream.Create;
  Memo1.Lines.SaveToStream(TmpStream);
  ShowMessage(IntToStr(TmpStream.Size));
  FDCommand1.Params.Clear;
  FDCommand1.CommandText.Text := 'UPDATE Reports SET Definition= :pDefinition WHERE StVn=1';
  FDCommand1.Params.ParamByName('pDefinition').LoadFromStream(TmpStream, ftWideMemo);
  FDCommand1.Execute();
  TmpStream.Free;
end;
procedure TForm1.BMemoLoadClick(Sender: TObject);
var
  TmpStream: TStream;
begin
  FDQuery.SQL.Text := 'SELECT * FROM Reports WHERE StVn=1';
  FDQuery.Open();
  TmpStream := FDQuery.CreateBlobStream(FDQuery.FieldByName('Definition'), bmRead);
  ShowMessage(IntToStr(TmpStream.Size));
  Memo1.Lines.LoadFromStream(TmpStream);
  TmpStream.Free;
end;
As you can see, I have inserted ShowMessage calls to see the stream size at saving and at retrieving. If I save just the default text 'Memo1', I get a length of 7 at saving and a length of 14 at loading the memo (it is always doubled).
Any ideas what I am doing wrong?
Note, I have not verified the database saving/loading as I don't have MSSQL, but I'm pretty sure this is the cause:
By default, TStrings uses the default encoding (TEncoding.Default), which is most likely ANSI (in my case Windows-1252), hence the length for the memo text showing as 7 bytes: 5 for "Memo1" and two for the CRLF.
However, your column is of type NTEXT, which stores text as UTF-16. When you read it back, you do so as a blob, and FireDAC does not perform any character conversion then, hence the doubling in size.
I would suggest you treat the report as binary data and store it as such, using an "image" type column and ftBlob instead of ftWideMemo.
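A minimal sketch of that suggestion, assuming the Definition column has been changed to a binary type such as image (or varbinary(max)); the names are taken from the question:
// store the stream as raw bytes - no text encoding conversion happens
FDCommand1.CommandText.Text := 'UPDATE Reports SET Definition= :pDefinition WHERE StVn=1';
FDCommand1.Params.ParamByName('pDefinition').LoadFromStream(TmpStream, ftBlob);
FDCommand1.Execute();
Reading it back with CreateBlobStream, as in your load routine, then returns exactly the bytes that were written, so FastReport's LoadFromStream sees an intact definition.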

Access linked tables truncating my Decimal values from the SQL server

Since migrating the Access data to a SQL server, I am having multiple problems with the decimal values. In my SQL tables on the SQL 2012 server, I am using the Decimal data type for multiple fields. A while ago I first tried to set the decimal values to 18,2, but Access acted weird on this by truncating all the values (55,55 became 50 and so on).
So after multiple changes it seemed that Access accepted the 30,2 decimal setting on the SQL server (now the values were linked correctly in the linked Access tables).
A few days ago I stumbled back onto this problem because a user had problems editing a number in the Access form. So I checked the linked table data type, and there it seemed that Access converts the decimal 30,2 value to a Short Text data type, which is obviously wrong. So I did a bit of research and found out that Access cannot handle a 30,2 decimal, thus it is converted to text by the ODBC driver. (See my previous post: Access 2013 form field value gets cut off on changing the number before the point)
So to fix this latter error I tried, once again (forgetting that I had already messed around with it), to change the decimal value to 17,2 / 18,2 and some other decimal values, but with all these changes I am getting back to the truncating problem...
I found some posts about it, but nothing concrete and no answers on how to solve it.
Some additional information:
Using a SQL 2012 server
Using Access 2013
I have SQL Server Native Client 10 and 11 installed.
Looking in the registry key, I found out that I am using ODBC driver version 02.50.
The SQL Native Client 11 has/uses DriverODBC ver 03.80 and the Native Client 10 uses DriverODBC ver 10.00 (not sure this is relevant though).
UPDATE WITH IMAGES
In an Access form I have multiple lines that have a linked table (SQL table) as record source. These lines get populated with the data in the SQL server.
Below you can see a line with a specific example; the eenh. prijs (unit price) is loaded from the linked (SQL) table.
Now when I change the 5 in front of the point (so making it 2555,00 instead of 5555,00), the value gets cut off:
So I did research on it and understand that my SQL decimal 30,2 isn't accepted by Access. So I looked in my Access linked table to see what kind of data type the field is:
So the specific column (CorStukPrijs) is a decimal 30,2 in the SQL server, but here it is Short Text (sorry for the Dutch words).
The other numerics (which are OK) are just normal integers by the way.
In my linked table in Access (datasheet view), the values look like this:
I also added a decimal value to show how it looks in my linked table:
In my SQL server the (same) data looks like this:
Though, because of the problem with changing the number before the point (back in the form - first images), I changed the decimal type of 30,2 on the server to 18,2.
This is the result in the linked table for that same 5555 value:
It gives #Error and the error message:
Scaling of decimal values has resulted in truncated values
(translated, so it probably won't read exactly like that in English)
The previous 0,71 value results, with the decimal 18,2, in:
Hope it's a bit clearer now!
P.S. I just changed one decimal field to 18,2 now.
Recently I found a solution for this problem! It all had to do with language settings after all... (and the decimal 30,2, which is not accepted as a decimal in Access 2013).
I changed the Native client from 10 to 11 and in my connection string I added one vital value: regional=no. This fixed the problem!
So now my connection string is:
szSQLConnectionString = "DRIVER=SQL Server Native Client 11.0;SERVER=" & szSQLServer & ";DATABASE=" & szSQLDatabase & ";UID=" & szSQLUsername & ";PWD=" & szSQLPassword & ";regional=no;Application Name=OPS-FE;MARS_Connection=yes;"
A few things:
There's no real good reason to try a decimal value of 30 digits.
Access only supports 28 digits for a packed decimal column. So going to 30 will force Access to see that value as a string.
If you keep the total digits below 28, then you should be OK - shrinking the column is a one-line change on the server, as sketched below.
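A hedged T-SQL sketch of that change (the table name dbo.Prices is hypothetical; CorStukPrijs is the column from the question):
-- shrink the column below Access's 28-digit limit for packed decimals
ALTER TABLE dbo.Prices ALTER COLUMN CorStukPrijs decimal(18,2);
After running it, refresh the linked table in Access so the new type is picked up.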
You also left out which driver you are using (legacy, or native 10, or native 11). However, all 3 should have no trouble with decimal.
As a few noted here, after ANY change to the SQL table, you have to refresh the linked table, else such changes will not show up.
There is NO need to have some re-link code run every time on startup. And it's not clear how your re-link code works. If the re-link code makes a copy of the tabledef object and then re-instates the same tabledef, then changes to the back end may well not show up.
I would suggest that during testing you DO NOT use your re-link routines, but simply right-click on the given linked table and choose the linked table manager. Then click on the one table, and OK to refresh.
Also, in Access during this testing, dump (remove) any formatting you have in the table settings (the Format setting).
I suggest you start over, and take the original tables and re-upsize them again.
Access should and can handle the decimal types with ease, but it's not clear what your original settings were. If the values never require more than 4 significant digits beyond the decimal, then I would consider using Currency, but Decimal should also work.

How to increase the sample size used during schema discovery to 'unlimited'?

I have encountered some errors with the SDP where one of the potential fixes is to increase the sample size used during schema discovery to 'unlimited'.
For more information on these errors, see:
No matched schema for {"_id":"...","doc":{...}
The value type for json field XXXX was presented as YYYY but the discovered data type of the table's column was ZZZZ
XXXX does not exist in the discovered schema. Document has not been imported
Question:
How can I set the sample size? After I have set the sample size, do I need to trigger a rescan?
These are the steps you can follow to change the sample size. Beware that a larger sample size will increase the runtime of the algorithm, and there is no indication in the dashboard other than the job remaining in the 'triggered' state for a while.
Verify the specific load has been stopped and the dashboard status shows it as stopped (with or without error)
Find the document https://<account>.cloudant.com/_warehouser/<source> where <source> matches the name of the Cloudant database you have issues with
Note: Check https://<account>.cloudant.com/_warehouser/_all_docs if the document id is not obvious
Replace "sample_size": null (which scans a sample of 10,000 random documents) with "sample_size": -1 (to scan all documents in your database) or "sample_size": X (to scan X documents, where X is a positive integer)
Save the document and trigger a rescan in the dashboard. A new schema discovery run will execute using the defined sample size.
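A hedged example of that edit using curl (the _rev and the other fields shown are illustrative; Cloudant requires the document's current _rev on every update, and all other fields of the scheduling document must be sent back unchanged):
# fetch the current scheduling document
curl https://<account>.cloudant.com/_warehouser/<source>
# returns e.g. {"_id":"<source>","_rev":"3-abc...","sample_size":null,...}
# write it back with the new sample size
curl -X PUT https://<account>.cloudant.com/_warehouser/<source> \
  -H "Content-Type: application/json" \
  -d '{"_id":"<source>","_rev":"3-abc...","sample_size":-1, ...}'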

SSIS - How to convert real values for Oracle?

I'm facing a problem in a package that imports some data from a MySQL table to an Oracle table and an MS SQL Server table. It works well from MySQL to SQL Server; however, I get an error when I want to import to Oracle.
The table I want to import contains an attribute (unitPrice) of data type DT_R8.
The destination data type for Oracle is DT_NUMERIC, as you can see in the capture.
I added a conversion step to convert the unitPrice data from DT_R8 to DT_NUMERIC.
It doesn't work; I get the following error.
I found the detail of the error :
An ORA-01722 ("invalid number") error occurs when an attempt is made to convert a character string into a number, and the string cannot be converted into a valid number. Valid numbers contain the digits '0' through '9', with possibly one decimal point, a sign (+ or -) at the beginning or end of the string, or an 'E' or 'e' (if it is a floating point number in scientific notation). All other characters are forbidden.
However, I don't know how to fix it.
EDIT: I added a component to redirect rows/errors to an Excel file.
The following screenshot shows the result of the process, including errors:
Browsing the 3000 rows recorded, it seems the process accepts only integer values, not reals. So if the price is equal to 10 it's OK, but if it's 10,5 it fails.
Any ideas how to solve this issue?
Your NLS environment does not match the expected one. By default, Oracle assumes that "," is the grouping character and "." is the decimal separator. Make sure that your session uses the correct value for the NLS_NUMERIC_CHARACTERS parameter.
See Setting Up a Globalization Support Environment for docu.
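A hedged way to check and fix this at the session level (the mask and sample value are illustrative):
-- see what the session currently uses
select value from nls_session_parameters where parameter = 'NLS_NUMERIC_CHARACTERS';
-- make ',' the decimal separator and '.' the group separator
alter session set nls_numeric_characters = ',.';
-- or convert explicitly, independently of the session setting
select to_number('10,5', '999D9', 'nls_numeric_characters='',.''') from dual;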

SAS Create permanent format from permanent dataset

I have a permanent data set called Branch (Branch code, Branch description).
I want to create a format from that dataset (a permanent one).
I can see that this gives me more or less what I want, but how do I put it into a permanent format catalog?
proc format library=Home.Branch fmtlib;
run;
What I've tried
proc print data=Home.DataSetToApply;
format B_Code $B_CODE_FORMAT.;
run;
This works if I manually create the format. I can't seem to create a permanent format directly from a data set.
Could you point me in the right direction?
Resources
Creating a Format from Raw Data or a SAS® Dataset
SAS has an autoexec.sas file which executes when you start SAS.
Of course, whether this is a valid option depends on your access rights and the OS you're running.
Have a look here: http://support.sas.com/documentation/cdl/en/hostwin/63285/HTML/default/viewer.htm#win-sysop-autoexec.htm
You could just drop the format code into the auto-executing script to have your format always available when using SAS.
This will create a dataset containing the formats from the named library:
proc format cntlout=myfmtdataset lib=mylibname;
select myformatname; *if you want to just pick one or some - leave out select for all;
quit;
This will import it back into formats (later):
proc format cntlin=myfmtdataset lib=myotherlibname;
quit;
That could of course be in your autoexec, or in your regular code.
If you are trying to take a dataset to make a permanent format, you need to set it up like this:
Required:
fmtname = name of the format
start = starting value (or single value)
end = ending value (this can be missing if only single values)
label = formatted value
Optional:
type = type of format (N=numeric, C=character, I=informat, J=character informat)
hlo = various options (H = end is highest value, L = start is lowest value, O = other, M = multilabel, etc.)
Then use the CNTLIN option to load it, as in the sketch below. SAS documentation has more detail if you need it.
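A minimal sketch, assuming Home.Branch has character variables Branch_Code and Branch_Description (rename them to your actual variable names):
data work.branch_fmt;
  set Home.Branch(rename=(Branch_Code=start Branch_Description=label));
  retain fmtname '$B_CODE_FORMAT' type 'C';  * character format, as used above;
run;

proc format cntlin=work.branch_fmt library=Home.Branch;
run;

* tell SAS where to look for the permanent format in later sessions;
options fmtsearch=(Home.Branch work);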
