I tried to find documentation on the subject but have come up short so far.
I am trying to use Logic Apps to update a table when a trigger occurs.
Adding some context:
I have many separate Excel Online files located in different areas of SharePoint, with one table in each of those files. Any time the SQL table is updated, I get the following elements:
Name
Age
path_to_doc
doc_id
Name and Age are the elements I wish to add to those Excel files.
path_to_doc is the path to the Excel file that needs to be updated.
doc_id is the id of the Excel file that needs to be updated.
In the "Add row to a table" action, those are the elements that need to be filled:
Site (manual, no problem, this doesn't change)
Document Library (manual, no problem, this doesn't change)
File (this is where I have my first problem: when I do not pick it manually and instead try to pass either "path_to_doc" or "doc_id", it doesn't work)
Table (it seems I can force it to be Table1, which is fine because all my Excel files have a table called Table1)
Arguments (that is, Azure understands the table and its components and asks you to fill in the ones you need; those fields disappear when you change from a manual input to "path_to_doc" or "doc_id").
It throws me an error:
ERROR 400
NOTE: When I do it manually, it works.
Has anyone experienced this and found a solution?
Thank you
You don't need to use an Expression.
For example, if we want to get the tables of the modified Excel file, we can do it like this.
A similar flow works in SharePoint.
Finally found the answer.
I needed to go to the code view and add my dynamic details there for the body.
Thank you for your help.
Here is the solution. I hope it helps others :)
In the designer view, create an "Add a row into a table" action and use the dynamic path that points to the Excel file you need to update. It will show an error and you will not be able to add the body arguments.
In the code view, you can now manually add the body of the request to include the elements you wish to update in the table of the Excel file.
That's it!
I'm testing out a trial version of Snowflake. I created a table and want to load a local CSV called "food" but I don't see any "load" data option as shown in tutorial videos.
What am I missing? Do I need to use a PUT command somewhere?
I don't think Snowsight has that option in the UI. It's available in the classic UI, though: go to the Databases tab and select a database, then go to the Tables tab and select a table; the option will be at the top.
If the classic UI is limiting you or you are already using Snowsight and don't want to switch back, then here is another way to upload a CSV file.
A prerequisite is that you have installed SnowSQL on your device (https://docs.snowflake.com/en/user-guide/snowsql-install-config.html).
Start SnowSQL and perform the following steps:
Use the database you want to upload the file to. You need various privileges for creating a stage, a file format, and a table. E.g. USE MY_TEST_DB;
Create the file format you want to use for uploading your CSV file. E.g.
CREATE FILE FORMAT "MY_TEST_DB"."PUBLIC".MY_FILE_FORMAT TYPE = 'CSV';
If you don't configure the RECORD_DELIMITER, the FIELD_DELIMITER, and other stuff, Snowflake uses some defaults. I suggest you have a look at https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html. Some of the auto detection stuff can make your life hard and sometimes it is better to disable it.
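For instance, a more explicit file format might look like this (the options shown are just an assumed example for a comma-separated file with a header row; adjust them to your data):
CREATE OR REPLACE FILE FORMAT "MY_TEST_DB"."PUBLIC".MY_FILE_FORMAT
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  RECORD_DELIMITER = '\n'
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  EMPTY_FIELD_AS_NULL = TRUE;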
Create a stage using the previously created fileformat
CREATE STAGE MY_STAGE file_format = "MY_TEST_DB"."PUBLIC".MY_FILE_FORMAT;
Now you can put your file into this stage:
PUT file://<file_path>/file.csv @MY_STAGE;
You can find documentation for configuring the stage at https://docs.snowflake.com/en/sql-reference/sql/create-stage.html
You can check the upload with
SELECT d.$1, ..., d.$N FROM @MY_STAGE/file.csv d;
Then, create your table.
CREATE TABLE MY_TABLE (col1 varchar, ..., colN varchar);
Personally, I prefer to first create a table with only varchar columns and then create a view or a table with the final types. I love the try_to_* functions in Snowflake (e.g. https://docs.snowflake.com/en/sql-reference/functions/try_to_decimal.html).
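As a sketch of that second step (the column names and target types below are made up for illustration):
CREATE OR REPLACE VIEW MY_TABLE_TYPED AS
SELECT
  col1 AS name,                          -- keep as varchar
  TRY_TO_DECIMAL(col2, 10, 2) AS amount, -- NULL instead of an error on bad values
  TRY_TO_DATE(col3) AS created_on        -- NULL instead of an error on bad values
FROM MY_TABLE;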
Then, copy the content from your stage to your table. If you want to transform your data at this point, you have to use an inner select. If not, the following command is enough.
COPY INTO mycsvtable FROM @MY_STAGE/file.csv;
I suggest doing this without the inner SELECT because then the option ERROR_ON_COLUMN_COUNT_MISMATCH works.
Be aware that the schema of the table must match the format. As mentioned above, if you go with all columns as varchars first and then transform the columns of interest in a second step, you should be fine.
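If you do need to transform while loading, a transforming COPY looks roughly like this (the column positions and the NULLIF cleanup are only an assumed example):
COPY INTO mycsvtable FROM (
  SELECT d.$1, NULLIF(d.$2, ''), d.$3   -- turn empty strings in the second column into NULLs
  FROM @MY_STAGE/file.csv d
);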
You can find documentation for copying the staged file into a table at https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html
You can check the dropped lines as follows:
SELECT error, line, character, rejected_record FROM table(validate("MY_TEST_DB"."MY_SCHEMA"."MY_CSV_TABLE", job_id=>'xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'))
Details can be found at https://docs.snowflake.com/en/sql-reference/functions/validate.html.
If you want to add those lines to your success table, you can copy the dropped lines to a new table and transform the data until the schema matches the schema of the success table. Then, you can UNION both tables.
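A rough sketch of that repair step (the two-column layout, the comma split, and the MY_REJECTS_FIXED name are only assumptions; real fixes usually need more care):
-- rebuild columns from the raw rejected lines of the last COPY in this session
CREATE TABLE MY_REJECTS_FIXED AS
SELECT SPLIT_PART(rejected_record, ',', 1) AS col1,
       SPLIT_PART(rejected_record, ',', 2) AS col2
FROM TABLE(VALIDATE("MY_TEST_DB"."MY_SCHEMA"."MY_CSV_TABLE", JOB_ID => '_last'));

-- once the schema matches, combine both tables
SELECT * FROM "MY_TEST_DB"."MY_SCHEMA"."MY_CSV_TABLE"
UNION ALL
SELECT * FROM MY_REJECTS_FIXED;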
You can see that there is quite a lot to do just to load a simple CSV file into Snowflake. It becomes even more complicated when you take into account that every step can cause specific failures and that your file might contain erroneous lines. This is why my team and I are working at Datameer to make these types of tasks easier. We aim for a simple drag-and-drop solution that does most of the work for you. We would be happy if you would try it out here: https://www.datameer.com/upload-csv-to-snowflake/
I have to take my first steps with TYPO3 now.
I have an extension with some tables in ...typo3conf\ext\my_extension\ext_tables.sql and would like to put each table definition in a separate file, because the file is getting very long.
Is it possible?
The best approach is still to put everything into the ext_tables.sql file, as many checks happen against this file: if you add new fields, remove fields, or add tables, the DB compare in the Install Tool can handle that.
Have a look at an example in CMS7
/typo3/sysext/install/Classes/Controller/Action/Tool/UpgradeWizard.php::silentCacheFrameworkTableSchemaMigration()
where the given SQL file is used to perform the update, e.g.:
/typo3/sysext/core/Resources/Private/Sql/Cache/Backend/Typo3DatabaseBackendCache.sql
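For reference, each table definition inside ext_tables.sql is just a plain CREATE TABLE statement, so the file grows but stays easy to diff; the table and column names below are only a made-up example of what one entry typically looks like:
CREATE TABLE tx_myextension_domain_model_item (
  uid int(11) NOT NULL auto_increment,
  pid int(11) DEFAULT '0' NOT NULL,
  title varchar(255) DEFAULT '' NOT NULL,
  description text,
  tstamp int(11) unsigned DEFAULT '0' NOT NULL,
  crdate int(11) unsigned DEFAULT '0' NOT NULL,
  deleted tinyint(4) unsigned DEFAULT '0' NOT NULL,
  hidden tinyint(4) unsigned DEFAULT '0' NOT NULL,
  PRIMARY KEY (uid),
  KEY parent (pid)
);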
Epicor - what a beastly creature!
Epicor is asking for a password after making a table change. Any idea why?!
We removed the relationship from the Part table and set up criteria instead. Now it is asking for a password, which should not be happening.
The login prompt happens when I try to run the report. I am trying to figure out what I did to aggravate Epicor. The table was already there; I removed the relationship (Part table) and added criteria instead. Otherwise, that is exactly what I would have done. The only reason I did not add a table to the report data definition, as I originally wanted to, is that the Part table could only be added once, which is why I removed the relationship and added criteria instead.
From your description, it sounds like the problem is related to the XML generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when either more than one data source is referenced in the report or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, I would say if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an xml file by running a test report, then open the .rpt (or .rdl) associated with this report and set the datasource to the new xml file. This should update the new xml schema used as the datasource. Even if none of the fields were changed in the data definition, the datasource schema definition that is stored in these files define exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the XML relationships may be defined in a way that will affect how the data is displayed, which can be adjusted by using Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the xml file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the XML file generated by the test report may not be a well-formed XML file. I have seen many XML files with unclosed elements and similar issues that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy) and re-enter all of the parameters that existed in the former definition. Repeat the refreshing of the report datasource as described above and this problem should be fixed.
I have about 500 fixed-width columns in a flat file, and I want to apply the same logic to each of them to replace an empty column with null before it goes into the database.
I know the command to replace the empty string with null, but I really don't want to have to use the GUI to enter that command for every column.
So is there a tool out there that can do this all on the back end?
You could look at something like the EzAPI to create your data flow. In this answer, I have an example of how one creates an EzDerivedColumn and sets the formula within it:
Automatically mapping columns with EZApi with OLEDBSource
If you can install third-party components, I've seen a number of implementations of Trim-To-Null functionality on codeplex.com.
BIML might be an option to generate your package as well. I'd need to play with that to figure out the syntax, though.
My googlefu worked a little better after lunch.
I was able to modify roughly the 5th comment down on http://social.msdn.microsoft.com/Forums/sqlserver/en-US/222e70f5-0a21-4bb8-a3fc-3f365d9c701f/ssis-custom-component-derivedcolumn-programmatically-problems?forum=sqlintegrationservices to work for my needs.
My c# code will now loop through all the input columns from a "Flat File Source" object and add a derived column for each.
How do we redirect error/failed data to another table in SQL Server during data import in SSIS 2008?
In a particular data flow component - in Configure Error Output, choose to redirect the row. You may need to add some derived columns after that, and then union all your errors from different parts of your package together if you have just one unified error output.
Cade's way will work for any errors.
If you have data that you know in advance you want to redirect (say, states that are not in the list of official states, or people with no address), then you can do a conditional split and redirect the rows that way. I prefer to check for known problem issues rather than relying on a failed insert, to avoid sending things to my database that might actually fit into the field but are data I don't want. For instance, I got a file that had the phrase "Legislative restriction" in the last name field; this clearly wasn't a person, so I redirected the rows. But the actual text would have fit in our lastname field, and the record would have been inserted if I had just relied on error output.