When I insert data from the SQL Shell, it works fine. However, when I do the same thing through the GUI (the View/Edit Data function), whenever I enter a city (character, size 50), only the first letter is saved rather than the complete city name. Does anyone know why that happens?
This was a bug in pgAdmin4 and it was recently fixed. The fix will be available in the next public release of pgAdmin4.
Reference: Link
(Opening the following on behalf of a Snowflake client...)
When I try to insert into the table, it throws the error below:
Numeric value 'abc_0011O00001y31VpQAI' is not recognized
I have checked the table DDL and found only 3 columns defined as NUMBER and the rest as VARCHAR.
I checked the SELECT query and did not find any string values in those NUMBER columns. I also tried searching all the VARCHAR columns for the value 'abc_0011O00001y31VpQAI' and didn't find it anywhere.
I know that Snowflake doesn't always show the correct error. Am I missing anything here? Is there any way to fix it?
Both COL4_MRR and COL5_QUANTITY are NUMBER
INSERT INTO TABLE
(COL1_DATE, COL2_CD, COL3_CUST_NAME, COL3_LOC_NAME, COL4_MRR, COL5_QUANTITY)
SELECT
    '2019-10-03'  AS COL1_DATE,
    'AE'          AS COL2_CD,
    CUSTOMER_NAME AS COL3_CUST_NAME,
    LOCATION_NAME AS COL3_LOC_NAME,
    MRR_BILLED    AS COL4_MRR,
    QTY_BILLED    AS COL5_QUANTITY
FROM SCHEMA.V_TABLEA
UNION ALL
SELECT
    '2019-10-03'  AS COL1_DATE,
    'BE'          AS COL2_CD,
    CUSTOMER_NAME AS COL3_CUST_NAME,
    LOCATION_NAME AS COL3_LOC_NAME,
    NULL          AS COL4_MRR,
    QTY_BILLED    AS COL5_QUANTITY
FROM SCHEMA.V_TABLEB
I created a table_D with the same definition as the original TABLE and tried inserting into it; it worked fine. Then I inserted into the original TABLE from table_D, and it worked again.
I deleted those rows from the original TABLE and reran the job, and it worked fine.
There was no issue with the data, as it was all numeric. I even tried TRY_TO_NUMBER too; it inserted the data without any changes to the code.
...............
The client is currently waiting on the next day's run to re-test and determine whether this is a bug or an issue with their data. In the meantime, we are interested to see if anyone else has run into similar challenges and has a viable recommendation. THANK YOU.
The error typically means you are trying to insert non-numeric data (like 'abc_0011O00001y31VpQAI') into a numeric column. It seems like the customer did everything right in testing and TRY_TO_NUMBER() is a great way to verify numeric data.
Do the SELECT queries run fine separately? If so, then I would check whether there might be a potential mismatch in the datatype of the columns and make sure they are in the right order.
I would also check whether or not the header is being skipped in the file (that may be where the 'abc_0011O00001y31VpQAI' is coming from since the customer did not see it in the data).
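If it helps, here is a sketch of how TRY_TO_NUMBER could be used to hunt for non-numeric values in one of the source views (the view and column names are taken from the question; the cast to VARCHAR is just so the check is valid whatever the columns' declared types are):
SELECT CUSTOMER_NAME, LOCATION_NAME, MRR_BILLED, QTY_BILLED
FROM SCHEMA.V_TABLEA
WHERE (MRR_BILLED IS NOT NULL AND TRY_TO_NUMBER(MRR_BILLED::VARCHAR) IS NULL)
   OR (QTY_BILLED IS NOT NULL AND TRY_TO_NUMBER(QTY_BILLED::VARCHAR) IS NULL);
Any rows returned are the ones that would trip the NUMBER columns during the INSERT.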
The SELECT queries work fine. I tried creating a new table with the same DDL as the original and loading into that new table, and it worked fine. I'm not sure why it is not loading into the original table.
I use UPDATE a SET GR_P = REPLACE(GR_P,'','') FROM mytable a to replace characters.
But the REPLACE function is not working for the character below:
In Query Analyzer it works, but when I use an SSIS Execute SQL Task or an OLE DB Source, it gives me the error:
No Connection manager is specified.
In Toad against Oracle (since that's one of your tags), I issued the following (pressing ALT-12 to get the female symbol) and got 191 as a result. Note that selecting it back using CHR(191) shows an upside-down question mark, though.
select ascii('♀') from dual;
Given that, the following worked, but it's Oracle syntax, so your mileage may vary.
UPDATE mytable SET GR_P = REPLACE(GR_P, CHR(191));
Note if it does not work, that symbol could be for another control character. You may need to use a regular expression to eliminate all characters not in a-zA-Z0-9, etc. I suspect you'll need to update your tags to get a more accurate answer.
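If it comes to that, here is a sketch of the regular-expression approach in Oracle (a hedged example: it assumes GR_P is a plain character column and that everything outside letters, digits, and spaces should be removed):
UPDATE mytable SET GR_P = REGEXP_REPLACE(GR_P, '[^a-zA-Z0-9 ]', '');
Adjust the character class to whatever set of characters you actually want to keep.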
Maybe this info will help anyway. Please post back what you find out.
I have created a local table and inserted some data. As long as I don't use isSecure for fields, everything works fine. But when I use isSecure for some fields, I only get encrypted text for those fields. It seems that a decrypt function is missing. How can I solve this?
For accessing the fields I use:
var field_value = Data.execute("select f3 from Testtable where rowid = 3;");
alert("Row 3:" + field_value);
Actually, you should get encrypted values if you try this kind of SQL execute statement, regardless of how many fields are selected as isSecure.
I mean, if there is an isSecure field, it should always be seen as encrypted. Marking some fields as isSecure, or all of them, shouldn't change anything.
The new release of Smartface App Studio is available right now.
(http://account.smartface.io/Account/Login?ReturnUrl=%2F)
I tested your case, it works fine with this new release.
By the way, if you want to reach the actual value in the table, try the code below:
Data.myDataset.move(2); // move to the same row referenced above (rowid = 3)
alert("Row 3 : " + Data.myDataset.f3);
You should write your dataset's name instead of myDataset.
I am trying to seed my database using a Database Project in Visual Studio 2012. As part of my Post Deployment script, I have a statement similar to the following:
INSERT INTO SomeTable (SomeTextCol) Values (N'$(function(){});')
The column is defined as NVARCHAR(MAX)
This causes the post build event to fail. However the strange thing is, if I execute the statement using SSMS it succeeds.
Is this caused by the fact the Database Project is using SQLCMD in the background?
How can I fix this error (whilst still being able to insert the jQuery into the table)? Changing the column value isn't an option for me, as I do not own the schema.
You are right; this is caused by SQLCMD, for which $ is a special character.
One solution to fix this would be to concatenate two strings:
INSERT INTO SomeTable (SomeTextCol) Values (N'$'+'(function(){});');
It is a little ugly, but it works. An alternative would be to use a different symbol for jQuery like its complete name:
INSERT INTO SomeTable (SomeTextCol) Values (N'jQuery(function(){});');
Or, if you have a lot of references to jQuery inside the function:
(function (JQ) {
    JQ(function () {
        // other references to JQ
    });
})(jQuery);
This last solution has the added advantage that anyone defining $ or JQ anywhere else on the page won't break your code, as "your" JQ is defined in a closure visible only to your code.
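If you want to confirm what actually ends up in the table after deployment, a quick check (a sketch; the table and column names are the placeholders used above):
SELECT SomeTextCol FROM SomeTable WHERE SomeTextCol LIKE N'%(function(){});%';
With the concatenated INSERT, this should come back as $(function(){});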
I'm looping through fifty or more Excel files. The loop goes through each Excel file, grabs all the data, and inserts it into the database without error. This is the typical setup of setting DelayValidation to true and making sure that the expression for the Excel connection uses a string variable called EFile that is set to nothing (in the loop).
What is not working: trying to input the name of the Excel file into the database.
What's been tried (edit; SO changed my 2 to 1 - don't know why):
Add a Derived Column between the Excel source and the database destination, and add a column using the EFile expression (so under Expression in the Derived Column it would be #[User::EFile]). However, this inserts only a blank (nothing).
One suggestion was to add ANOTHER string variable, set its EvaluateAsExpression property to True, and set its Expression to the EFile variable (#[User::EFile]). The funny thing is that this does the same thing: it inserts a blank into the database.
Numerous people on blogs claim they can do this, yet I haven't seen one actually address it (I have a blog and I will definitely be showing people how to do this when I get an answer because, so far, these others have fallen short). How do I grab an Excel file's name and insert it into the database during a loop?
Added: Forgot to add, no scripts; the claim is that it can be done without them, so I want to see the solution without them.
Note: I already have the ability to import the data from the Excel files - that's easy (see my GitHub account, as I have two different projects for importing all sorts of txt, csv, xls, xlsx data). I am also trying to get the actual name of the file being imported into the database. So, if there are fifty Excel files, along with the data in each file, the database will have the fifty file names alongside that data (so if each file has 1000 rows of data, each of those 1000 rows would also have the name of the file it came from next to it as an additional column). This point seems to cause a lot of confusion, as people assume I'm having trouble importing data from files - NOPE, see my GitHub; again, that's easy. It's the FILENAME that needs to also be imported.
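For reference, a sketch of the kind of destination table being described (only the FilePath column and its VARCHAR(2000) type come from this thread; the other column name is hypothetical):
CREATE TABLE dbo.ExcelImport (
    FilePath  VARCHAR(2000),   -- file name/path supplied by the SSIS Derived Column
    SomeValue VARCHAR(255)     -- hypothetical data column loaded from each Excel sheet
);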
Test package: https://github.com/tmmtsmith/SSISLoopWithFileName
Solution: #jaimet pointed out that the Derived Column needed to use #[User::CurrentFile] (see the test package). When I first ran the package, I still got a blank value in my database. But when we originally set up the connection, we point it to an actual file (I call this "fooling the package"), then later change the expression on the connection to #[User::CurrentFile], which is blank. The Derived Column, using the variable #[User::CurrentFile], therefore showed a string length of 0. So I removed the Derived Column, put the full file path and name in the variable, then added the variable to the Derived Column (which made it think the string was 91 characters long), then went back and set the variable to nothing (an English teacher would hate the THENs about right now). When I ran the package, it inserted the full file path. Maybe, like the connection, it needs to initially think that a file exists in order for it to insert the full number of characters?
Appreciate all the help.
The issue is because of the blank value in the variable #[User::FileNameInput]; this caused the SSIS package to assume that the value of this variable would always be of zero length in the Derived Column transformation.
Change the expression on the Derived column transformation from #[User::FileNameInput] to (DT_STR, 2000, 1252)#[User::FileNameInput].
Type casting the derived column to 2000 sets the column length to that maximum value. The value 1252 represents the code page; I assumed that you are using the ANSI code page. I took the value 2000 from your table definition because the FilePath column was declared as VARCHAR(2000). If the column data type had been NVARCHAR(2000), then the expression would be (DT_WSTR, 2000)#[User::FileNameInput].
Tim,
You're using the wrong variable in your Derived Column component. You are storing the filename in #[User::CurrentFile] but the variable that you're using in your Derived Column component is #[User::FileNameInput]
Change your Derived Column component to use #[User::CurrentFile] and you'll be good.
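(As a sketch, combining that variable with the type cast from the other answer, the Derived Column expression would be: (DT_STR, 2000, 1252)#[User::CurrentFile].)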
Hope that helps.
JT
If you are using a ForEach loop to process the files in a folder, then I have used the technique described in SSIS Junkie's blog to get the filename into an SSIS variable: SSIS: Enumerating files in a Foreach loop
You can use the variable later in your flow to write it to the database.
To all intents and purposes, your method #1 should work. That's exactly how I would attempt to do it. I am baffled as to why it is not working. Could you perhaps share your package?
Tony, thanks very much for the link. Much appreciated.
Regards
Jamie