Sometimes when using external URL APIs to process a large amount of data (e.g. checking every post on a WordPress site to see whether its embedded YouTube videos have been deleted) you run into the dreaded timeout: the page dies on a MySQL connection timeout or, worse, a web server timeout, even with set_time_limit(0) in place – set_time_limit() only lifts PHP's own execution limit, not the database or server ones. One way to resolve this, if not elegant, is to process a limited number of records per script call (a LIMIT clause in the MySQL query, or a loop counter) and use a small JavaScript snippet to reload the script/page until all the data has been processed.
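A minimal sketch of the batching side of this approach, assuming a hypothetical posts table with a checked flag, a $mysqli connection, and a check_video() helper (none of which are from the original code): each call processes one small batch and flags the rows it finished, so a plain page reload naturally resumes where the previous call stopped.

```php
<?php
// Hedged sketch: $mysqli, the `posts` table, its `checked` flag and
// check_video() are illustrative assumptions, not part of the article.
$batch = 50; // records per script call, kept small to stay under the timeouts

// Totals used by the reload snippet ($itotal / $iprocessed)
$itotal     = (int) $mysqli->query("SELECT COUNT(*) FROM posts")->fetch_row()[0];
$iprocessed = (int) $mysqli->query("SELECT COUNT(*) FROM posts WHERE checked = 1")->fetch_row()[0];

// Process one batch of unchecked posts, flagging each row as done so that
// the next page load automatically picks up the remaining rows.
$result = $mysqli->query("SELECT id, youtube_id FROM posts WHERE checked = 0 LIMIT $batch");
while ($row = $result->fetch_assoc()) {
    check_video($row['youtube_id']); // the slow external API call
    $mysqli->query("UPDATE posts SET checked = 1 WHERE id = " . (int) $row['id']);
    $iprocessed++;
}
?>
```

With the flag-based bookkeeping, the JavaScript reload needs no offset in the URL; the query itself skips everything already processed.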
<?php
// $iprocessed is the number of records processed so far
// $itotal is the total number of records to be processed
if ($iprocessed < $itotal) {
    $countdown  = "<form name='d'>\n";
    $countdown .= " <input style='border: none; font-weight: bold; width: 200px;' type='text' name='d2'>\n";
    $countdown .= "</form>\n";
    $countdown .= "<script>
<!--
var milisec = 0;
var seconds = 0;
document.d.d2.value = 'Pausing';

function display() {
    if (milisec >= 9) {
        milisec = 0;
        seconds += 1;
    } else {
        milisec += 1;
    }
    if (seconds > 1) {
        document.d.d2.value = 'Reloading';
        location.reload(true);
    } else {
        setTimeout(\"display()\", 100);
    }
}

display();
//-->
</script>";
    echo $countdown;
}
?>