mysql - Memory allocation problems with CakePHP


I still have a problem with the memory usage of CakePHP. I have read a lot of threads and tried out a thousand things, but nothing helps. I use CakePHP 2.4.x working with a MySQL database, and I wrote an API that selects log data from the database. A test I ran confuses me a lot: I have a select that returns 1400 rows with 11 columns. The general memory usage (measured with memory_get_usage) is 4 MB. After this select:

$condition = array('conditions' => array(
    'logbackend.created_at BETWEEN ? AND ?' => array(
        $this->params['url']['from'], $this->params['url']['to']
    )
));
$this->data = $this->logbackend->find('all', $condition);

my memory usage explodes to more than 9 MB, and at that point there has been no return of or work on the data at all, just the select. 9 MB is not much by itself, but if I select more (it should be possible to handle around 100k rows), even 256 MB runs out of space. Can anyone tell me the reason for this ridiculous memory usage?
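For reference, this is a minimal sketch of how the numbers above were measured: it only wraps the question's own find() call with memory_get_usage()/memory_get_peak_usage() and logs the result, so the model and parameter names are taken straight from the code above.

// Minimal measurement sketch (assumes the same controller context as above).
$before = memory_get_usage();

$condition = array('conditions' => array(
    'logbackend.created_at BETWEEN ? AND ?' => array(
        $this->params['url']['from'], $this->params['url']['to']
    )
));
$this->data = $this->logbackend->find('all', $condition);

// Log rows fetched and memory before/after/peak, in MB.
$this->log(sprintf(
    'find(all): %d rows, %.1f MB -> %.1f MB (peak %.1f MB)',
    count($this->data),
    $before / 1048576,
    memory_get_usage() / 1048576,
    memory_get_peak_usage() / 1048576
), 'debug');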

Best regards

Just use paginate instead, and process the data in chunks. Cake is geared towards chunks of 20 or so rows going to the view; you want to produce a report based on millions of rows. CakePHP has a function called paginate which can be tricked into doing a MySQL LIMIT 100000, 1000 pretty efficiently. I found these hacks by googling pretty intensely; there is a different syntax for CakePHP 1.3, and if nobody posts information on Cake 3 and its changes, I will come back and post it some day.

// CakePHP 2.4.1
$condition = array('conditions' => array(
    'logbackend.created_at BETWEEN ? AND ?' => array(
        $this->params['url']['from'], $this->params['url']['to']
    )
));

$this->paginate['conditions'] = $condition['conditions'];
$this->paginate['limit'] = 1000;
// use $this->paginate['fields'] = array('field1', 'field2'); to limit memory

/******* here is the trick to iterate paginate in the controller *****/
// replace 'Modelname' with your actual model name
$this->request->params['named']['page'] = 1;
$this->request->params['paging']['Modelname']['nextPage'] = true;

// process in chunks
while ($this->request->params['paging']['Modelname']['nextPage']) {
    $rows = $this->paginate('Modelname');
    // boil down this chunk of data here
    $this->request->params['named']['page'] += 1;
}

It doesn't make sense to pass millions of rows to the view. The while loop overwrites $rows each time, re-using core memory. The challenge is to find a clever way of distilling the data, or to write a new table during the loop that is used in the next pass of refinement. Execution will still be slower in CakePHP due to the model processing overhead, so if there are a lot of unneeded columns, use paginate['fields'] to process only the fields you need; fields are the same thing as columns. A sketch of distilling each chunk follows below.
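As a hedged illustration of "boiling down" each chunk, here is a sketch that counts log rows per level instead of accumulating raw rows. The Logbackend model and its level field are names made up for the example, and the conditions are re-used from the question above.

// Sketch only: aggregate each paginated chunk instead of keeping the raw rows.
// 'Logbackend' and the 'level' field are assumed names; adjust to your model/schema.
$this->paginate['conditions'] = $condition['conditions'];
$this->paginate['limit'] = 1000;
$this->paginate['fields'] = array('Logbackend.level'); // fetch only the column we need

$counts = array();
$this->request->params['named']['page'] = 1;
$this->request->params['paging']['Logbackend']['nextPage'] = true;

while ($this->request->params['paging']['Logbackend']['nextPage']) {
    $rows = $this->paginate('Logbackend');
    foreach ($rows as $row) {
        $level = $row['Logbackend']['level'];
        $counts[$level] = isset($counts[$level]) ? $counts[$level] + 1 : 1;
    }
    // $rows is overwritten on the next iteration, so memory stays roughly constant
    $this->request->params['named']['page'] += 1;
}

$this->set('counts', $counts); // only the small summary reaches the view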

PHP is philosophically tied to Apache and isn't really meant for this kind of thing. It's meant to deliver results in 30 seconds or less and render a web page, so this isn't Cake's fault. Python and Ruby might be better suited, or Perl, though you'll cry like a baby when you have to sift through their tools to format the output in HTML.

