Script execution time on a shared server
I am trying to get a rough estimate of how much of a resource hog some code I wrote really is. I should mention that a user would only run it once, as part of a one-time conversion process over a batch of images.
For each image, the script resizes it to a thumbnail, saves the thumbnail, and inserts a DB record, roughly as sketched below.
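For context, here is a minimal sketch of the kind of per-image loop I mean (the thumbnail width, file paths, and table/column names are placeholders, not the actual code):

```php
<?php
// Sketch of the per-image work: resize to a thumbnail, save it, insert a row.
// Paths, sizes, and the table/column names are placeholders.
$pdo  = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO images (filename, thumb) VALUES (?, ?)');

foreach (glob('uploads/*.jpg') as $file) {
    // Load the original and work out thumbnail dimensions (150 px wide here).
    $src    = imagecreatefromjpeg($file);
    $w      = imagesx($src);
    $h      = imagesy($src);
    $thumbW = 150;
    $thumbH = (int) round($h * $thumbW / $w);

    // Resample into the thumbnail and save it alongside the original.
    $thumb = imagecreatetruecolor($thumbW, $thumbH);
    imagecopyresampled($thumb, $src, 0, 0, 0, 0, $thumbW, $thumbH, $w, $h);
    $thumbFile = 'thumbs/' . basename($file);
    imagejpeg($thumb, $thumbFile, 85);

    // One DB row per processed image.
    $stmt->execute([basename($file), $thumbFile]);

    imagedestroy($src);
    imagedestroy($thumb);
}
```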
If the script took 5 seconds* to process 56 moderately sized images (1.07 MB, or 1,127,908 bytes, of data) on localhost with all of the machine's resources available to it, how hard a hit would that be on an average shared server?
I may test this for myself, but I assume twice as many images would simply double the time, since the work is done per image (5 seconds for 56 images is roughly 0.09 seconds each, so 112 images should take about 10 seconds).
* Measured with microtime() from start to finish.
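The 5-second figure is just the wall-clock difference between two microtime(true) calls wrapped around the whole run, something like this:

```php
<?php
// How the timing was taken: wall-clock time around the entire batch.
$start = microtime(true);

// ... resize, save, and insert for all 56 images ...

$elapsed = microtime(true) - $start;
printf("Processed 56 images in %.2f seconds\n", $elapsed);
```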