How do I keep my page from timing out?

Glenn

Member
I have a PHP page that is doing a LOT. It runs for 2 or 3 minutes and then times out without finishing. I set up a cron job to call the page, and it does not finish there either. I am using it to pull information from a site whose products I am selling through an affiliate program. It reads several pages from that site, pulls out several hundred items for sale, and places them on my site. Actually, it saves them in a database, and my site pulls them from there.

Is there a way to keep it from timing out? I'm assuming that's what is happening, since it will not finish. If I skip the first portion, it continues further down than before. The more I skip, the closer it comes to finishing.
 

Glenn

Member
I have a lot of file_get_contents() calls in it, which seems to be what is taking so long. Maybe 15,000 of them inside a double loop. One loop runs 20 times, with the embedded loop running 15 times, and the whole thing is called 50 times.
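
For context, the setup described here would look roughly like the sketch below. Only the loop counts come from the post; the URL pattern and the parsing step are assumptions. Each run makes 20 x 15 = 300 blocking HTTP requests, and 50 runs is 15,000 file_get_contents() calls in total, which is why a single page load never finishes.

<?php
// Rough sketch only -- the URL and parsing step are assumptions, not the
// actual code. Each file_get_contents() is a separate network round trip.
for ($i = 0; $i < 20; $i++) {
    for ($j = 0; $j < 15; $j++) {
        $url  = "https://affiliate.example.com/list/$i/$j"; // hypothetical URL
        $html = file_get_contents($url);
        if ($html !== false) {
            // parse $html and save the items to the database (omitted)
        }
    }
}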
 

CaldwellYSR

Member
Sounds to me like you know exactly what you need to get rid of :p FYI, your PHP questions would be better suited to the programming section or to a site like Stack Overflow, where they are more focused on programming.
 

Phreaddee

Super Moderator
Staff member
Sounds to me like it's hitting a snag and getting stuck.
15,000 function calls seems, to be honest, bloody stupid, and one small error in there and the whole lot won't work.

If you know PHP inside out, go through your code with a fine-tooth comb and find the error. If you don't, you will need to employ someone else to have a look at it. Like Matthew said, it sounds like a programming issue, and you would probably get a better response from a site like Stack Overflow.
 

Glenn

Member
I found a solution to this. It was set up as a cron job to run once or twice a day. I set up a database table so the job runs 2 of the 50 batches at a time, once an hour. It just rotates through the database, so each batch runs once a day.
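
A hedged sketch of that rotation: an hourly cron job picks the 2 batches (out of 50) that have gone longest without running, processes them, and restamps them. The `batches` table, its columns, the DSN, and fetch_batch() are assumptions made for illustration.

<?php
$db = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Placeholder for the original scraping code for one batch of pages.
function fetch_batch(PDO $db, int $batchId): void
{
    // ... file_get_contents() calls and database inserts for this batch ...
}

// Pick the 2 batches that have waited the longest.
$ids = $db->query('SELECT id FROM batches ORDER BY last_run ASC LIMIT 2')
          ->fetchAll(PDO::FETCH_COLUMN);

foreach ($ids as $batchId) {
    fetch_batch($db, (int) $batchId);

    // Stamp the batch so it moves to the back of the rotation.
    $stmt = $db->prepare('UPDATE batches SET last_run = NOW() WHERE id = ?');
    $stmt->execute([$batchId]);
}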
 

leroy30

New Member
I would expect your 'snag' is that you are trying to download 15,000 web pages in a single page load.

Why don't you create a background worker process that iterates through the items one by one? That way your website or application can continue to function without getting stuck in a for loop.

Basically, you want to queue the processes, not loop through them.
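
A minimal sketch of that queued approach: every page that needs scraping becomes a row in a queue table, and a cron-driven worker pulls a small batch each run instead of one page load trying to download everything. The `job_queue` table, its columns, and the DSN are assumptions for illustration.

<?php
$db = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Take a small slice of pending work so a single run stays well under the
// execution time limit.
$jobs = $db->query("SELECT id, url FROM job_queue WHERE status = 'pending' ORDER BY id LIMIT 25")
           ->fetchAll(PDO::FETCH_ASSOC);

foreach ($jobs as $job) {
    $html = file_get_contents($job['url']);
    if ($html === false) {
        continue; // leave the job pending so a later run retries it
    }

    // parse $html and save the items to the products table (omitted)

    $done = $db->prepare("UPDATE job_queue SET status = 'done' WHERE id = ?");
    $done->execute([$job['id']]);
}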
 

leroy30

New Member
Oh, and why do you want to store all 15,000 products at once? Why don't you create some sort of caching process?

i.e. A customer visits your site and is looking for products 1, 2, 3 and 4. If a product has been in the database for longer than 2 hours, update products 1, 2, 3 and 4 before serving them to the customer, and restamp the date/time on each record, so the next customer who comes along within the next 2 hours just gets the cached version from your database.

I think you just need to think about improving your method of extracting the data. Any large amount of data scraped from someone else's website is going to take a while.
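
A sketch of that cache-on-demand idea: serve a product straight from the database if it was refreshed within the last 2 hours, otherwise re-fetch it from the affiliate site first. The `products` table, its columns, and refresh_product() are assumptions for illustration.

<?php
// Placeholder: fetch the page from the affiliate site, update the row,
// set updated_at = NOW(), and return the fresh record.
function refresh_product(PDO $db, int $id): array
{
    return [];
}

function get_product(PDO $db, int $id): array
{
    $stmt = $db->prepare(
        'SELECT * FROM products WHERE id = ? AND updated_at > NOW() - INTERVAL 2 HOUR'
    );
    $stmt->execute([$id]);
    $product = $stmt->fetch(PDO::FETCH_ASSOC);

    // Missing or older than 2 hours: refresh before serving, which also
    // restamps updated_at for the next customer.
    return $product === false ? refresh_product($db, $id) : $product;
}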
 