How do I complete hundreds of concatenations in MS Excel (2013)?

I have a spreadsheet with 202 rows and 692 columns. How do I concatenate the 692 columns for each of the rows?
I have tried this on a smaller dataset: =CONCATENATE(B2:G2&","), then pressing F9 to replace B2:G2 with the contents of those cells. That works fine, but I can't copy it down to the other rows. Also, when I try it on my real data (692 columns), there are too many data fields for Excel to complete the concatenation.
Any tips please? I'm unfamiliar with coding and have only a basic level of Excel knowledge.
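For reference, Excel 2013 lacks the built-in TEXTJOIN function (it arrived in 2016), so a small user-defined function is a common workaround. A minimal sketch (the name JoinRange is made up; paste it into a standard module via Alt+F11 > Insert > Module):

Public Function JoinRange(rng As Range, Optional delim As String = ",") As String
    ' Concatenate every non-empty cell in rng, separated by delim
    Dim cell As Range, result As String
    For Each cell In rng.Cells
        If Len(cell.Value) > 0 Then result = result & cell.Value & delim
    Next cell
    ' Drop the trailing delimiter
    If Len(result) > 0 Then result = Left$(result, Len(result) - Len(delim))
    JoinRange = result
End Function

You would then enter, say, =JoinRange(A2:ZP2) in the first output cell (adjust the range to wherever your 692 columns actually sit) and fill it down the 202 rows.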

Related

Unable to sort the records in Data Flow Task in SSIS

I am trying to load around 7 million records from a flat file into a database. I need to sort these records for merging. My Sort task within the DFT (Data Flow Task) is able to read 7 million rows as input but outputs only 90k rows. Is there any limit on the number of rows that can be sorted in SSIS? If yes, what are the possible alternatives?
The issue was blank and null values in certain columns: they broke the Sort transformation, so only a few rows came out sorted. I added a Conditional Split transformation and removed the null and blank values.

Copy a very large number of rows from one sheet to another, excluding blank rows in Excel 2010

I'm currently working on an Excel workbook that uses the following formula to copy all rows from one sheet (Creation_Series_R) to another, excluding empty rows.
{=IFERROR(INDEX(Creation_Series_R!C:C;SMALL(IF(Creation_Series_R!$C$3:$C$20402<>"";ROW(Creation_Series_R!$C$3:$C$20402));ROW()-ROW(Creation_Series_R!$C$3)+1));"")}
And the formula works very well. Except that my proof of concept only had a few rows; with the final data I need to work on 20,400 rows. Add to that the fact that I have 17 columns, and 3 similar sheets with similar formulas, and my workbook takes an hour to recalculate every time I input just one value.
This workbook is designed as a way for a client to enter data, which is then reorganized so that it can be imported directly into our software. I have already limited the amount of data the user can enter per workbook (to their very great disappointment), so I can't really reduce it below 20,400 rows (that's only 100 funds' worth of financial data).
Is there a way, perhaps using a macro, that I could do this more efficiently?
The big block of array formulas is killing your performance (time-wise).
If your data is in column A through Q, then I would use column R as a "helper" column. In R2 insert:
=COUNTA(A2:Q2)
and copy down. The macro would then do the following (a rough sketch appears after these steps):
AutoFilter column R
Hide all rows showing 0 in column R
Copy the visible rows and paste elsewhere as a block
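A minimal sketch of such a macro, assuming headers in row 1, the data with its COUNTA helper in A2:R20402, and a hypothetical destination sheet named Output:

Sub CopyNonBlankRows()
    Dim src As Worksheet, dst As Worksheet
    Set src = ThisWorkbook.Worksheets("Creation_Series_R")
    Set dst = ThisWorkbook.Worksheets("Output")   ' hypothetical target sheet
    Application.ScreenUpdating = False
    With src.Range("A1:R20402")
        ' Filter on the helper column: keep only rows whose COUNTA is not 0
        .AutoFilter Field:=18, Criteria1:="<>0"
        ' Copy the visible data rows (columns A:Q) and paste as one block
        .Offset(1, 0).Resize(.Rows.Count - 1, 17) _
            .SpecialCells(xlCellTypeVisible).Copy dst.Range("A2")
        .Parent.AutoFilterMode = False            ' clear the filter again
    End With
    Application.ScreenUpdating = True
End Sub

Because the macro runs once on demand instead of recalculating 20,400 array formulas at every input, it should bring the hour-long recalculation down to seconds.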

Transform data into variable amount of columns

I've been struggling with a challenge I've inherited. I think it's solvable, but I can't easily see what the solution would be. It's along the lines of the return-rows-as-columns scenario (I appreciate there are already a lot of posts about that, but I think this is slightly different). I thought a PIVOT would be useful, but I don't need to aggregate any values. The metadata describing each document is defined at folder level, so documentID = 1 should be tagged with the values 111, ABC, DEF and GHI.
So I have a table as follows:
Current Dataset:
The aim is to have the data presented as this instead so all the metadata tags for both the folders and document are stored in one row (ultimately I'll be exporting to CSV)
Desired Dataset:
I have the same problem and haven't worked it out completely yet; I have only done it twice now, manually, and am planning to make a macro.
I work with vertical lookups, sorting, and deleting data.
In your case, sort the data based on metatag_value and transfer all the BBB, CCC, etc. to new columns. Then write a VLOOKUP in the Metatag_value2, etc. columns on the AAA row.
Copy/paste the entire datasheet as values to get rid of the formulas, and then delete all the rows without AAA in them.
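A macro could do this in one pass instead. The sketch below assumes a simplified, hypothetical layout (column A = document ID, column B = metatag value, headers in row 1, sheets named Data and Pivoted; all names are placeholders, and it assumes "|" never occurs inside a tag value):

Sub PivotTagsToColumns()
    Dim src As Worksheet, dst As Worksheet
    Dim dict As Object, key As Variant, tags As Variant
    Dim r As Long, c As Long, outRow As Long
    Set src = ThisWorkbook.Worksheets("Data")
    Set dst = ThisWorkbook.Worksheets("Pivoted")
    Set dict = CreateObject("Scripting.Dictionary")
    ' Collect every tag per document ID, in order of appearance
    For r = 2 To src.Cells(src.Rows.Count, "A").End(xlUp).Row
        key = src.Cells(r, "A").Value
        If dict.Exists(key) Then
            dict(key) = dict(key) & "|" & src.Cells(r, "B").Value
        Else
            dict.Add key, src.Cells(r, "B").Value
        End If
    Next
    ' Write one row per ID, spreading its tags across the columns
    outRow = 2
    For Each key In dict.Keys
        dst.Cells(outRow, 1).Value = key
        tags = Split(dict(key), "|")
        For c = 0 To UBound(tags)
            dst.Cells(outRow, c + 2).Value = tags(c)
        Next
        outRow = outRow + 1
    Next
End Sub

The result is already one row per document, so it can be saved straight to CSV without the sort/VLOOKUP/delete cycle.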

How to turn huge Excel sheets into a database?

I have 12 very large Excel sheets.
Each one is 102 columns wide, and an average of 600K rows long.
They're all identical in structure.
Quite often I need to run a specific query on only a subset of columns, with specific criteria, and the process of opening each file and filtering/sorting/finding what I need has become very tedious.
If they were all in a database, such queries would be so much easier.
I tried MS Access, and I tried SQL Server Express.
Both die on me during their respective Import wizards.
And the failure is certainly due to the size of the data set, because if I manually trim a file to say 10 rows, the import works fine.
Any ideas how to do this? I'm open to using Access, or SQL Server Express, or any other tool that does the job really.
Note: some of the columns contain values that themselves contain commas, so the several suggestions I found to turn the files into CSVs prior to import ended up producing broken structure.
Edit 1: They're 12 separate workbooks, with 1 sheet in each. The aim is to create a single table with all the data appended.
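One way around the comma problem, assuming it is what broke the wizards, is to export each sheet as tab-delimited text from VBA and bulk-import those files instead. A minimal sketch with hypothetical paths:

Sub ExportAllAsTabDelimited()
    Dim f As String, wb As Workbook
    f = Dir("C:\data\*.xlsx")            ' hypothetical folder holding the 12 workbooks
    Do While Len(f) > 0
        Set wb = Workbooks.Open("C:\data\" & f)
        ' xlText = tab-delimited, so embedded commas no longer break the fields
        wb.Worksheets(1).SaveAs Filename:="C:\data\" & Replace(f, ".xlsx", ".txt"), _
                                FileFormat:=xlText
        wb.Close SaveChanges:=False
        f = Dir
    Loop
End Sub

Each resulting .txt file can then be appended into a single SQL Server table with the Import wizard (delimiter set to Tab) or BULK INSERT, which also avoids pushing 600K-row sheets through Access.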

Importing CSV files to Excel 2003 with more than 256 columns

I'd like to import some CSV files with 375 columns into Excel 2003, but Excel 2003 limits the number of columns to 256.
I must edit this question for relevance:
I have several CSV files with exactly 375 columns and varying numbers of rows. How can I delete every second (odd-numbered) column, which contains unnecessary information? The unneeded columns are numbers 5 to 375 (5, 7, 9, ..., 373, 375).
Is there a useful way to delete all the unneeded columns with a VBS script?
I will keep my fingers crossed for a solution, and thank you in advance.
Mike
Have a look at CSV editors; they will generally allow you to delete columns. You will have to specify which columns to delete (i.e. you will not be able to delete columns based on their content), but you should be able to delete enough to view/edit the file in Excel. You could also cut/paste from the CSV editor into Excel.
Here are several (the first 2 are free, the next 2 have a 30-day trial):
http://csved.sjfrancke.nl/
http://recsveditor.sourceforge.net/
http://www.whitepeaksoftware.com/killink-csv-editor/
http://www.gammadyne.com/csv_editor_pro.htm
You could also Google "CSV editor" or check SourceForge for more.
After a long time and no response...
I tried several CSV editors (free and paid), but none seems to have an option to delete columns based on the string "percent" appearing in their header. Only ReCsvEditor_0.80.6 works with filters, but the filter does not seem to work properly.
If anyone has a suitable VBS (WSH) script solution I would be very grateful.
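In the meantime, a minimal VBS (WSH) sketch along those lines, assuming plain comma separation with no quoted fields that themselves contain commas, and hypothetical file paths; save it as, say, dropcols.vbs and run it with cscript:

Option Explicit

Dim fso, inFile, outFile, rec, fields, kept, i
Set fso = CreateObject("Scripting.FileSystemObject")
Set inFile = fso.OpenTextFile("C:\data\input.csv", 1)      ' 1 = ForReading
Set outFile = fso.CreateTextFile("C:\data\output.csv", True)

Do Until inFile.AtEndOfStream
    rec = inFile.ReadLine
    fields = Split(rec, ",")
    kept = ""
    For i = 0 To UBound(fields)
        ' Keep columns 1-4 and every even-numbered column (1-based),
        ' i.e. drop the odd-numbered columns 5, 7, ..., 375
        If (i + 1) <= 4 Or ((i + 1) Mod 2) = 0 Then
            If Len(kept) > 0 Then kept = kept & ","
            kept = kept & fields(i)
        End If
    Next
    outFile.WriteLine kept
Loop

inFile.Close
outFile.Close

To delete by header instead (e.g. any column whose first-row value contains "percent"), you would read the first line, record which indexes to keep, and test i against that list inside the loop.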
