From: "Richard S. Crawford" on
I have a script that connects to an external database to process about 4,000
records. Each record needs to be operated on pretty heavily, and the script
overall takes about half an hour to execute. We've hit a wall where the
script's memory usage exceeds the amount allocated to PHP. I've increased
the allotted memory to 32MB, but that's not ideal for our purposes.

So my thought is: would it be more efficient, memory-wise, to read the
database entries into an array and process the array records, rather than
maintaining the database connection for the entire run of the script? This is
not an issue I've come across before, so any thoughts would be much
appreciated.
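
A simplified sketch of the two approaches I mean (the connection details,
the "records" table, and process_record() are made-up stand-ins for what the
real script does):

<?php
// Approach A: read everything into an array, drop our reference to the
// connection, then do the heavy work against the array.
function load_then_process(PDO $pdo) {
    $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);
    $pdo = null; // no longer needed during the heavy processing
    foreach ($rows as $row) {
        process_record($row); // stand-in for the heavy per-record work
    }
}

// Approach B: keep the connection open and fetch one row at a time.
function process_while_fetching(PDO $pdo) {
    $stmt = $pdo->query('SELECT * FROM records');
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        process_record($row);
    }
}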

Thanks in advance.

--
Richard S. Crawford (richard(a)underpope.com)
http://www.underpope.com
Publisher and Editor in Chief, Daikaijuzine (http://www.daikaijuzine.com)
From: Bastien Koert on
What I usually do is pull a limited set of records (say 10 or 50), do the
operations on them, update a column in that table to mark them completed, and
use JavaScript to reload the page and pull the next set where that flag field
is null.

No memory issues, no need for large timeouts, and it's self-recovering.
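
In code, something along these lines (a rough sketch, assuming a nullable
"processed" flag column; the table, column, and connection names are made up,
and process_record() stands in for the heavy work):

<?php
$pdo = new PDO('mysql:host=dbhost;dbname=external_db', 'user', 'pass');

// Pull the next small batch of records that haven't been flagged yet.
$rows = $pdo->query('SELECT id, payload FROM records WHERE processed IS NULL LIMIT 50')
            ->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare('UPDATE records SET processed = 1 WHERE id = ?');
foreach ($rows as $row) {
    process_record($row);              // heavy per-record work
    $mark->execute(array($row['id'])); // flag it so it isn't picked up again
}

// Keep reloading until there is nothing left to pull.
if (count($rows) > 0) {
    echo '<script>window.location.reload();</script>';
} else {
    echo 'All records processed.';
}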

Bastien

--

Bastien

Cat, the other other white meat
From: Ryan Sun on
Maybe you want to optimize your script first; I don't think reading the
entire data set into an array would save you much time.
Don't create new variables when it's unnecessary, cache data when it's
necessary, and try memcached.
Heavy PHP CLI scripts always seem to consume a lot of resources, and I think
PHP's GC is not so reliable... or maybe I just don't know how to use it.
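
For example (a rough sketch, assuming the memcached extension is available;
the lookup, the $records batch, and the helper functions are made-up
stand-ins):

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

// Cache an expensive lookup that many records share instead of redoing it
// (and re-allocating its result) for every record.
function lookup_rate($key, Memcached $cache) {
    $rate = $cache->get('rate_' . $key);
    if ($rate === false) {
        $rate = expensive_rate_lookup($key);     // stand-in for the costly call
        $cache->set('rate_' . $key, $rate, 300); // keep it around for 5 minutes
    }
    return $rate;
}

// Free large per-record variables as soon as they are no longer needed.
foreach ($records as $i => $record) {            // $records: the current batch of rows
    $result = process_record($record, $cache);   // stand-in for the per-record work
    save_result($result);                        // stand-in for writing it back
    unset($records[$i], $result);                // let PHP reclaim the memory
}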
