Excel to Access records with macro - database

Not sure even where to start on this. I have an Excel workbook with tabs for different tracking sheets: some for maintenance tracking and some for personnel work hours on different jobs. It is very time consuming to pull, copy and paste the results I need and then compare them. I already know that an Access database would be a better product for tracking and pulling results together; however, my boss always favors Excel and does not want to get rid of the current products used by myself, the boss and a handful of others in the office. So I already know what needs to be converted to a database, but I have my hands tied for the time being.
So here is what I am trying to accomplish. My boss has set up macros for several of the Excel sheets to archive the day's worth of maintenance results and worker hours spent on jobs. What it currently does is make a copy of the sheet and save that copy to a network folder for us to look at if we have a problem and need to check results. I would like to recode the macro to instead save those results as records in an Access table. Some of the info is missing and would need to be created on the fly as each record is created: because the sheet tracks only results for the day, there is no date column in the Excel sheet, so the date (usually yesterday's) will have to be added to the record as it is created. Results get fed into the Excel sheet and stay until the next morning, when we hit the archive macro button to save yesterday's results to an .xls file.
Here is a screenshot of the sheet (the "DISPATCH LOG" tab).
So when I try to bring all the days together, I have to copy and paste from multiple .xls files into one just to get the stats I need. I know my way around Access better than I do Excel, so this would be a great time saver for me. If I could change the archive code to populate an Access table, it would be a great help. Anybody got any ideas? Thanks in advance for helping me.
Code used to archive the Excel sheet:
Sheets("DISPATCH LOG").Select
ActiveSheet.Copy                                 ' copies the sheet into a new workbook
Set WB = ActiveWorkbook
FileName = Format(Now(), "yyyymmdd") & ".xls"
On Error Resume Next
Kill "Y:\Dispatch_Archive\" & FileName           ' remove any earlier archive with the same name
On Error GoTo 0
WB.SaveAs FileName:="Y:\Dispatch_Archive\" & FileName
'Close the temporary workbook
WB.ChangeFileAccess Mode:=xlReadOnly
WB.Close SaveChanges:=False
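To write the same rows into Access instead of saving a copy of the workbook, the archive macro could insert records via ADO. A minimal sketch, assuming a database at Y:\Dispatch_Archive\Dispatch.accdb with a table DispatchLog whose fields (LogDate, Unit, Job, Hours) mirror columns A-C of the sheet; every one of those names is a placeholder for your real schema:

```vba
' Sketch: push each row of DISPATCH LOG into an Access table, stamping
' yesterday's date on every record since the sheet has no date column.
Sub ArchiveToAccess()
    Dim cn As Object, ws As Worksheet
    Dim r As Long, lastRow As Long
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
            "Data Source=Y:\Dispatch_Archive\Dispatch.accdb"
    Set ws = Sheets("DISPATCH LOG")
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
    For r = 2 To lastRow                         ' row 1 assumed to be headers
        cn.Execute "INSERT INTO DispatchLog (LogDate, Unit, Job, Hours) VALUES (" & _
            "#" & Format(Date - 1, "mm/dd/yyyy") & "#, " & _
            "'" & Replace(ws.Cells(r, 1).Value, "'", "''") & "', " & _
            "'" & Replace(ws.Cells(r, 2).Value, "'", "''") & "', " & _
            Val(ws.Cells(r, 3).Value) & ")"
    Next r
    cn.Close
End Sub
```

The key point is the LogDate field: the macro supplies Date - 1 (yesterday) as each record is created, which is exactly the column the daily sheet lacks.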

Maybe you could link Access to the workbooks as mentioned here:
http://office.microsoft.com/en-gb/access-help/import-or-link-to-data-in-an-excel-workbook-HA001219419.aspx
Kind of using the four spreadsheets as backend databases, then querying and updating them through an Access front end.
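If you go the linked-table route, the links can be created once from Access VBA rather than by hand. A sketch, where the path, sheet and table names are all assumptions:

```vba
' Sketch: link one tracking workbook as a table in Access.
' Repeat (or loop) for each of the workbooks you want to query.
DoCmd.TransferSpreadsheet TransferType:=acLink, _
    SpreadsheetType:=acSpreadsheetTypeExcel12Xml, _
    TableName:="tblDispatchLog", _
    FileName:="Y:\Tracking\DispatchLog.xlsx", _
    HasFieldNames:=True
```

Because acLink leaves the data in the workbook, whatever the office keeps typing into Excel is what Access sees on the next query, so nobody has to give up the current spreadsheets.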

Related

Method/Process to Handle Data in Persistent Manner

I've been banging my head against this for about a year, on and off, and I've just hit crunch time.
Business Issue: We use software called Compeat Advantage (a General Ledger system), and it provides an Excel add-in with a function that retrieves data from the Microsoft SQL database. The problem is that it must make a call to the database for each cell using that function. On average it takes about .2 seconds to make the call and retrieve the data. Not bad, except when a report uses these in volume: our standard report built with it has ~1,000 calls, so by the math it takes just over 3 minutes to produce the report.
Again, in and of itself that is not a bad amount of time for a fully custom report. The issue I am trying to address is that this is one of the smaller reports we run, AND in some cases we have to produce 30 variants of the same report, unique per location.
The function's arguments are: Unit(s) [String], Account(s) [String], Start Date, End Date. Everything is retrieved as a SUM() so that a single [Double] is returned.
SELECT SUM(acctvalue)
FROM acctingtbl
WHERE DATE BETWEEN startDate AND endDate AND storeCode = Unit(s) AND Acct = Account(s)
Solution Sought: For the standard report there are only three variations of the data retrieved (Current Year, Prior Year and Budget), and if they were all retrieved in bulk, as detailed tables/arrays, the 3-minute report would drop to less than a second to produce.
I want to find a way to retrieve the line-item detail once and store it locally for access, without needing an ODBC round trip for every single function call on the sheet.
SELECT Unit, Account, SUM(acctvalue)
FROM acctingtbl
WHERE date BETWEEN startDate AND endDate
GROUP BY Unit, Account
Help: I am failing to find a workable way to do this. The largest problem I have is the scope/persistence of the data. It is easy to call for all the data I need from the database, but keeping it around for use is killing me. Since these are spreadsheet functions, the variables are released after each call, so I end up in the same spot: each function call on the sheet takes .2 seconds.
I have tried storing the data in a CSV file but keep having data-handling issues moving it from the CSV into an array to search and sum the data. I don't want to manipulate the registry to store the info.
I am coming to the conclusion that if I want this to work, I will need to call the database, store the data in a very-hidden (xlSheetVeryHidden) tab, and then pull it forward from there.
Any thoughts would be much appreciated on what process I should use.
Okay!
After some lucky Google-fu I found a passable workaround.
"VBA - Update Other Cells via User-Defined Function" provided many of the answers.
Beyond his code, I had to add code to make that sheet recalculate every time the UDF was called, to check the trigger. I did that with a simple cell + cell formula that has a random number placed in it every time the workbook calculates.
I am expanding the code in the Workbook section now to fill in the holes.
Should solve the issue!
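For anyone attempting the bulk-pull variant discussed above, here is a sketch of caching the GROUP BY result on a very-hidden sheet; the connection string, date range, and sheet and column names are all assumptions:

```vba
' Sketch: pull the line-item detail once per refresh into a hidden cache
' sheet, so report cells can sum locally instead of hitting the database.
Sub RefreshDetailCache()
    Dim cn As Object, rs As Object, ws As Worksheet
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyGL;Integrated Security=SSPI;"
    Set rs = cn.Execute( _
        "SELECT Unit, Acct, SUM(acctvalue) AS Amt " & _
        "FROM acctingtbl " & _
        "WHERE DATE BETWEEN '2024-01-01' AND '2024-12-31' " & _
        "GROUP BY Unit, Acct")
    Set ws = Sheets("DetailCache")
    ws.Cells.ClearContents
    ws.Range("A1:C1").Value = Array("Unit", "Acct", "Amt")
    ws.Range("A2").CopyFromRecordset rs          ' dump the whole result set at once
    ws.Visible = xlSheetVeryHidden               ' keep the cache out of sight
    rs.Close
    cn.Close
End Sub
```

Report cells can then use =SUMIFS(DetailCache!C:C, DetailCache!A:A, unit, DetailCache!B:B, acct), which Excel evaluates locally, replacing the per-cell database round trip with a single bulk query.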

Create database in SQL from specific Excel files

I am still a newbie in this field, so I would really appreciate your help.
This is my task; I used the "Summary" tab as an example.
Every week I get a couple of Excel files that have the same format. I need to "edit" that format into the format that I need in SQL.
So, in the example you can find two sheets, "Summary" and "AAA"; these are the original tabs that I get from my colleagues. Because they are "merged", my first step is to unmerge them, and you can see the result of that in the sheet "Summary after unmerging".
After unmerging them, I need to put the data in a new "order", and you can see that result in the sheet "FINAL Summary table in SQL".
As I said, I have a couple of Excel files whose data I need to "union all" in SQL to create one table.
So, if this first step is doable, the next step would be to update this "ONE TABLE" every week from those Excel files that I get.
Thank you in advance.
P.S. How do I upload an Excel file here?
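Once each workbook has been cleaned into the same flat layout, the weekly append can be done in SQL Server with one statement per file. A sketch using OPENROWSET (which requires the 'Ad Hoc Distributed Queries' server option to be enabled; the paths, sheet name and column names here are assumptions, not your real ones):

```sql
-- Sketch: read a weekly workbook's Summary sheet and append it to one
-- combined table; run (or loop) this per file, then dedupe if needed.
INSERT INTO dbo.SummaryCombined (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\Week1.xlsx;HDR=YES',
                'SELECT * FROM [Summary$]')
UNION ALL
SELECT Col1, Col2, Col3
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\Week2.xlsx;HDR=YES',
                'SELECT * FROM [Summary$]');
```

The UNION ALL across files mirrors the "union all" step described above; the weekly update is then just another INSERT ... SELECT against the new file.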

Complex linking in Excel (dynamically?)

I'm not the best at using spreadsheets, but I've been given a task and it's possible I may be a little out of my depth (I'm more of a web programmer).
I have two sheets:
One called Area A and one called Area B with headings:
Time - Location - Reference
I need to set up a new sheet with these column headings:
Time - Reference - Location - Area
Then make a sortable list (I can do this bit)
The Area A & B sheets will be constantly changing, and this will need to be reflected in the new sheet whenever it is opened (maybe some sort of onload-style event?).
Any ideas on the easiest way to do the above (or whether it is indeed doable)? I don't want to be spoon-fed; I'd be happy to be pointed in the right direction or given some keywords I can Google (I learn better this way).
Many thanks!
This type of data manipulation is something that Excel is not good at, and it is prone to errors.
There are two good ways to do this.
Manually
On sheet "Area A", add a column with the area name, i.e. "Area A". Do this for each "data" sheet. Then, manually or via VBA, copy and paste one sheet at a time into your aggregated sheet.
Programmatically using VBA
Loop through each sheet and copy and paste to the aggregated sheet, adding a column with the sheet name as you paste.
For either of these methods, the important thing is to build in a few checks on row counts at the end to make sure you're not missing any data.
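The VBA variant above could look something like this sketch, assuming a destination sheet named "All Areas" and the Time/Location/Reference column order described in the question:

```vba
' Sketch: stack the rows of each area sheet onto "All Areas",
' reordering to Time - Reference - Location and adding the sheet
' name as the Area column.
Sub BuildCombinedSheet()
    Dim src As Worksheet, dst As Worksheet
    Dim lastRow As Long, nextRow As Long, r As Long
    Set dst = Sheets("All Areas")
    dst.Range("A2:D" & dst.Rows.Count).ClearContents
    nextRow = 2
    For Each src In Sheets(Array("Area A", "Area B"))
        lastRow = src.Cells(src.Rows.Count, 1).End(xlUp).Row
        For r = 2 To lastRow                             ' row 1 assumed headers
            dst.Cells(nextRow, 1).Value = src.Cells(r, 1).Value  ' Time
            dst.Cells(nextRow, 2).Value = src.Cells(r, 3).Value  ' Reference
            dst.Cells(nextRow, 3).Value = src.Cells(r, 2).Value  ' Location
            dst.Cells(nextRow, 4).Value = src.Name               ' Area
            nextRow = nextRow + 1
        Next r
    Next src
End Sub
```

Calling this from the Workbook_Open event would give the "onload" behaviour the question asks about, and comparing nextRow - 2 against the source row counts provides the sanity check mentioned above.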

Importing CSV to database (duplicate entries)

My job requires that I look up information on a long spreadsheet that's updated and sent to me once or twice a week. Sometimes the newest spreadsheet leaves off information that was in the last one, so I have to look through several different spreadsheets to find the info I need. I recently discovered that I could convert the spreadsheet to a CSV file and then upload it to a database table. With a few lines of script, all I have to do is type in what I'm looking for and voilà! Now I've just got the newest spreadsheet and I'm wondering if I can simply import it on top of the old one. There is a unique number for each row, which I have set as the primary key in the database. If I try to import on top of the current info, will it just skip the rows where the primary key would be duplicated, or would it mess up my database?
Thought I'd ask the experts before I tried it. Thanks for your input!
Details:
The spreadsheet consists of clients of ours. Each row contains the client's name, a unique ID number, their address and contact info. I can set the column containing the unique ID as the primary key, then upload it. My concern is that there is nothing to signify a new row in a CSV file (I think). When I upload it, it gives me the option to skip duplicates, but will it skip the entire row or just that cell, causing my data to be placed in the wrong rows? It's an Apache server; IDK what version of MySQL. I'm using 000webhost for this.
Higgs,
This issue, in database/ETL terminology, is called a deduplication strategy.
There is no template answer for this, but I suggest these helpful readings:
Academic paper - Joint Deduplication of Multiple Record Types in Relational Data
Deduplication article
Some open source tools:
Duke tool
Data cleaner
There's a little checkbox near the bottom when you click on Import that says "ignore duplicates" or something like that. Simpler than I thought.
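For reference, the command-line equivalent of that checkbox: with the unique client ID declared as the PRIMARY KEY, MySQL's LOAD DATA ... IGNORE skips whole rows whose key already exists, so duplicates never shift cells into other rows. A sketch with assumed file, table and column names:

```sql
-- Sketch: append the weekly CSV, silently skipping rows whose
-- client_id (the PRIMARY KEY) is already present in the table.
LOAD DATA LOCAL INFILE 'clients.csv'
IGNORE INTO TABLE clients
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(client_id, name, address, phone);
```

Using REPLACE instead of IGNORE would overwrite existing rows with the newest data rather than keep the old copy; which you want depends on whether the older spreadsheet or the newer one is authoritative.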

Manually Entered Data in Excel MS Query Is Misaligned After Refresh

I have created an MS SQL query in Excel.
I have added extra columns in the Excel sheet which I want to enter manual data in.
When I refresh the data, these manually inputted columns become misaligned with the imported data they refer to.
Is there any way around this happening?
I have tried to link the imported data sheet to a manual data sheet via VLOOKUP, but this isn't working as there are no unique fields to link on.
Please help!
Thanks
Excel version is 2010.
MS SQL version is 2005.
There is no unique data.
Excel initially looks like this [screenshot]; when we enter a new order into the database, Excel looks like this [screenshot].
Try this: in the External Data Range Properties, select "Insert entire rows for new data".
Not sure, but worth a try. And keep us updated on the result!
Edit: And make sure you provide a consistent sort order.
There is no relationship between the spreadsheet's external data and the columns you are entering. When refreshing, the data is typically cleared and updated, though you could play around with the options in the external data refresh menu to see whether changing what happens with new data helps.
If you want your manually entered data to link to the data in the embedded dataset, you have to establish the lookup with a VLOOKUP or some formula that finds the row's info and shows it.
Basically, you are treating the SQL data on the spreadsheet as static, but it isn't, unless you never refresh it or you disconnect it from the database.
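If the imported data does contain some combination of columns that is unique together (say an order number plus a line number; this layout is purely hypothetical), a helper key makes the VLOOKUP possible even though no single field is unique, and keeps the manual text on its own sheet where a refresh can never shift it:

```
In the query sheet, D2 (helper key):     =A2 & "|" & B2
In the query sheet, E2 (show the note):  =IFERROR(VLOOKUP($D2, Notes!$A:$B, 2, FALSE), "")
On a separate "Notes" sheet, column A holds the same key and column B the manual text.
```

Because the manual data is looked up rather than positioned, it stays attached to its row no matter how the refresh reorders or extends the range.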
Note that Marcel Beug has given a full solution to this problem in a more recent post in this forum: "Inserting text manually in a custom column and should be visible on refresh of the report".
He has even taken the time to record an example in a video: https://www.youtube.com/watch?v=duNYHfvP_8U&feature=youtu.be
