I have a problem where I'm importing a CSV file using the SQL Server Import Wizard, but the values change after the process completes, and I'm not sure why. I selected the UTF-8 code page, and I can see the values displayed correctly at this stage; this is a column containing Spanish translations:
Once the process completes, SQL Server shows these values completely wrong, as you can see below:
How can I import the file so that it keeps the original Spanish translations?
My database collation is SQL_Latin1_General_CP1_CI_AS, and this can't be changed.
Any help would be appreciated!
It seems this was related to the CSV export itself: I changed to exporting a .txt file instead, and that imported correctly, with everything matching the source data. I'm not sure of the root cause, but it's all good now. Thanks anyway!
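For anyone hitting the same issue: accented characters usually survive the trip if the target columns are NVARCHAR and the import explicitly declares the file's code page (UTF-8 is code page 65001, supported by BULK INSERT in recent SQL Server versions). A minimal T-SQL sketch; the table name, file path, and delimiters here are hypothetical:

```sql
-- Hypothetical target table: NVARCHAR columns preserve Unicode
-- text regardless of the database collation.
CREATE TABLE dbo.Translations (
    TranslationKey NVARCHAR(100),
    SpanishText    NVARCHAR(400)
);

-- Load the UTF-8 CSV; CODEPAGE = '65001' tells SQL Server
-- the file is encoded as UTF-8.
BULK INSERT dbo.Translations
FROM 'C:\data\translations.csv'
WITH (
    CODEPAGE        = '65001',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the header row
);
```

The collation only governs comparison and sorting rules; with NVARCHAR columns the characters themselves are stored as Unicode either way.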
Related
I am going to Tools -> Options (etc.) and setting a custom delimiter of |, but when I export as text or export to a file, the output is still tab-delimited.
I followed the instructions found at this URL (one example of several saying the same thing):
https://mssqltrek.com/2012/07/24/obtaining-pipe-delimited-results-from-sql-server-using-ssms/
but the output still comes out tab-delimited.
Am I missing something? This is SSMS 14.
Apparently, a restart of SSMS was needed.
Environment:
Windows 10
XAMPP Win32 7.0.23
PHP: 7.0.23
MariaDB 10.1.28?
Magento 2.1.9
I cannot export any databases. I get the "Warning: a form on this page has more than 1000 fields" message, and then the export does not work. From what I've read, you're supposed to increase max_input_vars in php.ini. I checked my only php.ini file; the line was commented out, so I removed the semicolon and increased the value several times. Here is what it looks like now:
; How many GET/POST/COOKIE input variables may be accepted
max_input_vars = 10000
No matter what I increase it to, I still get the same 1,000-fields message when I try to export a database. I searched the entire XAMPP directory and its subdirectories but found no other php.ini besides the one in the xampp/php directory. I wouldn't think it needs to be higher than 10,000 for a fairly new Magento database with only a few products in it; when I imported the database into XAMPP, it was less than 2 MB in total. I tried exporting another Magento database, for the unmodified demo site, and got the same warning and result.
Can someone help me? Thanks.
UPDATE:
info.php created, and it verifies that the current max_input_vars is 10000. I am selecting the database in phpMyAdmin and then clicking the Export button at the top. The export, in simple mode, selects all tables of the given database.
Put info.php into the document root with:
<?php
phpinfo();
?>
Call the file in a browser and check the value of max_input_vars.
Also, specify whether you are exporting a whole database or a query result, for further advice.
I've recently started using DataGrip, since my DBs are in different management systems -- it makes it easy to have them all on one platform.
However, I've run into several issues/challenges I just cannot find answers for.
Would anyone happen to know how to:
Change the settings so that, when loading data, not every column is imported as TEXT by default (I don't want to change INT to TEXT and NUMERIC to TEXT most of the time).
Import multiple CSVs at the same time: e.g., import 'n' CSV files with the same settings, versus importing the CSV files one by one (which takes significantly more time).
Thanks for your help,
There are no such possibilities for now.
I added your comment here: https://youtrack.jetbrains.com/issue/DBE-4917
Created a ticket, please follow: https://youtrack.jetbrains.com/issue/DBE-5277
I got a data file with a .LOD extension, from which I have to import data into a PostgreSQL database. I have never worked with .LOD files. Can someone please help me with the commands or steps I need to follow to import the data into my database?
If your text data are well formatted, try using the COPY command (http://www.postgresql.org/docs/9.4/static/sql-copy.html), or you can try http://pgloader.io/. If the data are not well formatted, you'll have to write a script to do the load.
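.LOD is just an extension; if the file turns out to be delimited text, COPY can load it directly. A sketch assuming a hypothetical table and a pipe-delimited file (adjust the columns and DELIMITER once you know the actual format):

```sql
-- Hypothetical staging table matching the file's columns.
CREATE TABLE staging_lod (
    id     integer,
    name   text,
    amount numeric
);

-- Server-side load: the file must be readable by the
-- PostgreSQL server process.
COPY staging_lod FROM '/data/input.lod'
WITH (FORMAT csv, DELIMITER '|', HEADER false);
```

If the file lives on your client machine instead, psql's \copy variant has the same syntax but reads the file locally.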
We need to send data to an external interface: a financial statutory report to be sent periodically to some country's government, which expects the data as text files with no separator between fields. Each field is fixed width, left-padded with zeros or blank spaces (depending on the data type) to fill the space.
So we created an RDL with the required fields and were looking at adding an export option for this.
Can an export option be created to export in this format?
We'd like to avoid having to CAST all the relevant fields to NVARCHAR in the stored procedure that feeds the report, left-pad them there, and finally concatenate everything - unless that's the only way to accomplish it.
Also, we definitely don't want to code an ad-hoc export method in .NET; there won't be many reports with this export option, so it's not worth it. Besides, it's not easy to convince the server team to deploy DLLs to the servers. If something is already coded - open source or shareware - we could have a look at it, though it wouldn't be the preferred solution.
Previously we had configured the Reporting Services 2008 XML configuration file to add a new export option, Text, with fields separated by pipes, so we were thinking of trying this first. Unfortunately, we discovered that if you use the Text or CSV rendering extensions but specify no separator, a comma is used by default.
Any ideas?
In my opinion, Reporting Services isn't the right tool for this job: you're trying to bend a reporting tool into a data migration tool. While on the surface it looks similar - outputting formatted data - a fixed-length text file isn't really what Reporting Services is good at producing out of the box. We do a similar fixed-length export using a scheduled SQL Server Agent job. You could also write a console app and run it manually, or use Windows Scheduled Tasks to run it.
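If you go the Agent-job route, the fixed-width line can be built in plain T-SQL, which is essentially the CAST/pad/concatenate approach the question hoped to avoid. A sketch with hypothetical column names and widths:

```sql
-- Build one fixed-width line per row: numeric fields are
-- left-padded with zeros, character fields right-padded
-- with spaces to their slot width.
SELECT
      RIGHT(REPLICATE('0', 10)
            + CAST(AccountId AS VARCHAR(10)), 10)       -- 10-char id
    + LEFT(CustomerName + REPLICATE(' ', 30), 30)       -- 30-char name
    + RIGHT(REPLICATE('0', 12)
            + CAST(CAST(Amount * 100 AS BIGINT)
                   AS VARCHAR(12)), 12)                 -- 12-char amount in cents
      AS FixedWidthLine
FROM dbo.StatutoryReport;
```

The single resulting column can then be written out by the job step (for example via bcp), with no separator to worry about.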
If you're determined to use Reporting Services for some functional reason (for example, the export has to use the same parameters to filter the data as the report being viewed), then I'd simply create a second report based on the first, with no header or footer: just a detail section containing a table with one wide field, whose expression builds the formatted text you need for your export. Put a link in the header of the main report called "Create export file" that opens this new report and passes the parameters across, then export it to a non-paginating format like CSV.
Otherwise, you're looking at creating a custom renderer, which you don't want to do, and which I agree is overkill for this.