I need a bit of advice about planning a PHP script :)
If anyone could help it would be appreciated.
The scenario is this: about 50 CSV files, each between 2MB and 60MB.
These files contain financial stats, but for the example it's best to simplify.
So we have rows like:
name,code,wins,losses,month
peter,123,5,2,feb
peter,123,6,1,mar
peter,123,3,2,apr
dave,126,2,3,feb
dave,126,1,1,mar
steve,34,6,1,mar
What I need to do is aggregate them and print out a report: one for peter, one for dave, and one for steve.
So it would be
name,wins,losses
peter,14,5
dave,3,4
steve,6,1
I'm thinking that I have to pop these into an array and process them that way, then dump it to a report - but it worries me that I'll be loading 50MB files into memory.
I know it would be nice to pull these into a MySQL db, but these CSV files will change each night :(
Anybody have thoughts on this?
Tony.
hmm... with files that big I've really no idea...
I'd probably try a preg_replace... but that'd be sloooow as :S (especially using a function in it...)
For each line: $array[] = explode(',', $line);
would probably be fastest I think... (or better yet, process each line as you read it instead of adding everything to an array first).
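Something along these lines maybe (untested sketch; it assumes the column order from your sample and that the files sit in a stats/ folder):
[code]
<?php
// Stream each CSV with fgetcsv() so only one row is in memory at a time.
// Assumed column order: name,code,wins,losses,month
$totals = array();

foreach (glob('stats/*.csv') as $file) {
    $fh = fopen($file, 'r');
    if (!$fh) {
        continue;
    }
    while (($row = fgetcsv($fh)) !== false) {
        // skip blank lines and the header row
        if (count($row) < 5 || $row[0] == 'name') {
            continue;
        }
        list($name, $code, $wins, $losses, $month) = $row;
        if (!isset($totals[$name])) {
            $totals[$name] = array('wins' => 0, 'losses' => 0);
        }
        $totals[$name]['wins']   += (int) $wins;
        $totals[$name]['losses'] += (int) $losses;
    }
    fclose($fh);
}

// dump the report
echo "name,wins,losses\n";
foreach ($totals as $name => $t) {
    echo "$name,{$t['wins']},{$t['losses']}\n";
}
?>
[/code]
Memory stays flat that way, because only the running totals (one entry per name) ever live in the array, never the raw rows.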
I'd say put them inside a MySQL database and then do what you want with them that way...
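Even with the files changing nightly, you could just truncate and reload after the FTP finishes. Rough sketch only (the connection details and the stats table/columns are made up, and LOAD DATA LOCAL INFILE has to be enabled on both the server and the client):
[code]
<?php
// Hypothetical nightly reload: wipe the table, then bulk-load each CSV.
$db = mysqli_connect('localhost', 'user', 'pass', 'stats_db');
mysqli_query($db, 'TRUNCATE TABLE stats');

foreach (glob('stats/*.csv') as $file) {
    $safe = mysqli_real_escape_string($db, $file);
    mysqli_query($db, "
        LOAD DATA LOCAL INFILE '$safe'
        INTO TABLE stats
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (name, code, wins, losses, month)
    ");
}

// the report then becomes a one-liner:
// SELECT name, SUM(wins), SUM(losses) FROM stats GROUP BY name
?>
[/code]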
Thanks guys,
I forgot to say, I'm running this on a W2K box and the files are FTP'd down every morning from a Unix box.
Although MySQL would be nice, I can't use it in this situation.
Maybe something like a logfile script could do it? I'll have a look :)
Thanks again,
Tony
The problem is when your files head towards the 60MB mark... that's when dumping the contents into an array and then using PHP to sort might get troublesome.
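Though if all you need is the per-name totals, the array that actually gets sorted is tiny (one entry per name, not one per row), so the sort cost hardly matters... e.g. (hypothetical $totals aggregate):
[code]
<?php
// Hypothetical aggregate: one entry per name, not one per CSV row,
// so it stays small no matter how big the source files were.
$totals = array(
    'peter' => array('wins' => 14, 'losses' => 5),
    'dave'  => array('wins' => 3,  'losses' => 4),
    'steve' => array('wins' => 6,  'losses' => 1),
);

// sort the report by wins, descending
uasort($totals, function ($a, $b) {
    return $b['wins'] - $a['wins'];
});

foreach ($totals as $name => $t) {
    echo "$name,{$t['wins']},{$t['losses']}\n";
}
?>
[/code]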
[edit]Does anyone know what sort algorithm PHP uses?[/edit]
Hmmm thanks Parham.
It's just a shame my boss won't approve MySQL inside the domain - I'm only allowed to have it on the web server :(