I apologise for the lack of postings recently on this forum, but this is due to a surfeit of work, and my clients must come first.
At 4:30pm on a Friday afternoon, I have finally finished a major data crunching exercise, and am in a position to write it all up. However, before I do that, I need to distance myself from the pain and backache of analysing more than 36 files with over 180 separate data sets in Excel (and perhaps close to half a million records). The back pain, and the swollen ankles too, come from my ability to sit mesmerised in front of a computer screen for hours on end, moving nothing but my mouse hand - obsessive, and entirely my own fault.
I also need to distance myself from the mental pain of it all. Excel is a fine product and I am a major fan of spreadsheets - they are very powerful and when used by skilled people can do amazing things. In the hands of incompetents, they will produce rubbish - but that is the way with many computer tools, particularly statistical ones. If you feed garbage in, you get garbage out, albeit with pretty graphs.
However, for large data sets Excel isn't always the answer, and my mental pain comes from the fact that about two thirds of the way through my epic analysis task (which took four solid days, I might add), I realised I would have finished sooner and with less effort if I had done the work differently.
If I had spent half a day or a day building an SPSS file with all the Excel data concatenated into one (easy enough, as it was all in a common format), I could then have spent a happy hour programming SPSS to do the analyses that have taken me so much time, plus a further day pushing the SPSS output into a pretty format for the report I will now spend the weekend writing.
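The "concatenate first, analyse once" idea generalises beyond SPSS. As a minimal sketch in Python (the data sets and column names here are hypothetical stand-ins for the per-file worksheets, all assumed to share one layout): stack every record into a single long table, tag each with its source file, and then a single grouped calculation replaces repeating the same analysis once per data set by hand.

```python
import statistics
from collections import defaultdict

# Hypothetical stand-in for the separate data sets: in practice each
# entry would be loaded from one Excel worksheet, all sharing the
# same common format described above.
datasets = {
    "file_a": [("north", 12.0), ("south", 15.5), ("north", 11.2)],
    "file_b": [("south", 14.1), ("north", 13.3)],
}

# Step 1: concatenate everything into one long table, tagging each
# record with its source file so nothing is lost in the merge.
combined = [
    (source, region, value)
    for source, records in datasets.items()
    for region, value in records
]

# Step 2: one grouped summary over the combined table, instead of
# redoing the same calculation manually in every spreadsheet.
groups = defaultdict(list)
for _source, region, value in combined:
    groups[region].append(value)

summary = {region: round(statistics.mean(vals), 2)
           for region, vals in groups.items()}
```

The half-day of up-front concatenation work buys you a summary step that is a few lines long, no matter whether there are two data sets or 180.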
There is a rather painful lesson to be learnt here, I think!