php - Sending data from server to client?
I have a PHP server file and an HTML client file. The HTML file sends AJAX requests to the server to retrieve data every 500 ms. Although this works as expected, it causes high memory and CPU usage on the client's device.
PHP:
if (isset($_POST['id']) && $_POST['id'] != '') {
    $id = $_POST['id'];
    $select = $con->prepare("SELECT * FROM data WHERE id=?");
    $select->bind_param('s', $id);
    $select->execute();
    $result = $select->get_result();
    while ($row = $result->fetch_assoc()) {
        echo $row['column 1'] . "\t" . $row['column 2'] . "\n";
    }
}
AJAX:
function send() {
    var formData = new FormData(),
        id = document.getElementById('id').value;
    formData.append('id', id);
    var xhr = (window.XMLHttpRequest) ? new XMLHttpRequest() : new ActiveXObject('Microsoft.XMLHTTP');
    xhr.open('POST', 'server.php', true);
    xhr.send(formData);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            console.log(xhr.responseText);
        }
    };
}
setInterval(function() { send(); }, 500);
I am looking for an alternative to this AJAX polling: instead of sending numerous requests to the server and retrieving the same data most of the time, it would be far more efficient if the server could notify the client whenever the data changes or is updated.
I can't use PHP sockets or the HttpRequest methods because they are not installed on the hosting server, and I'm not sure whether the latter would work anyway. The only other way I can think of is using sessions.
According to this, the PHP server stores users' sessions in the same directory on the server, so it may be possible to change a particular user's session variables directly in that file. The problem is that the data in these files is serialized, and I'm not sure how to deserialize the data, re-serialize it, and save the new values.
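For reference, this is roughly what I had in mind: a rough, untested sketch that assumes the default "files" session save handler and that the script is allowed to read and write the session directory. $targetSessionId is a placeholder for the other user's session ID, and I don't know whether shared hosts actually permit this.

// Rough, untested sketch: editing another user's session file directly.
// session_decode()/session_encode() operate on the current $_SESSION, so an
// active session is needed to use them as (de)serializers here.
$file = session_save_path() . '/sess_' . $targetSessionId; // file layout of the default "files" handler

session_start();
$raw = file_get_contents($file);             // serialized session data, e.g. 'newdata|s:5:"hello";'
session_decode($raw);                        // fill $_SESSION from the raw serialized string

$_SESSION['newdata'] = 'updated value';      // change the target user's variable

file_put_contents($file, session_encode());  // re-serialize and overwrite the target user's file
session_abort();                             // discard $_SESSION so the current user's own file isn't overwritten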
Even if I were able to find a way to store the updates in the session file, I would still need setInterval to listen for the session variable change every 500 ms. Although that's not ideal, it would still be better than using XMLHttpRequest in terms of memory and CPU usage.
So what's the best way to do this? Any help is appreciated.
Update:
I realized sessions won't work because they can only be read by the server, not the client, so I would still have to send AJAX requests to the server to get the variables, which is exactly what I'm trying to avoid.
I tried long polling but had many problems with it: flush() and ob_flush() don't work on my server, and I can't change the ini settings. And when I try an infinite loop, I can't get it to break on data change:
if (isset($_GET['size']) && $_GET['size'] != '') {
    $size = (int)$_GET['size'];
    $txt = "logs/logs.txt";
    $newsize = (int)filesize($txt);
    while (true) {
        if ($newsize !== $size) {
            $data = array(
                "size" => filesize($txt),
                "content" => file_get_contents($txt)
            );
            echo json_encode($data);
            break;
        } else {
            $newsize = (int)filesize($txt);
            usleep(400000);
        }
    }
}
It just keeps going on and on; even when the logs.txt size increases, it won't break! How can I make it break and echo the data when the size increases?
Update 2:
It turned out that PHP caches the file size when calling the filesize() function, which is why the loop above runs indefinitely. The solution is to call clearstatcache(), which clears the cached file stats and allows the loop to break when the file size changes.
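In other words, the else branch of the loop above just needs a clearstatcache() call before re-reading the size. A minimal version of the corrected loop, with the same file and timings as above:

$size = (int)$_GET['size'];      // last size the client saw
$txt = "logs/logs.txt";
$newsize = (int)filesize($txt);

while (true) {
    if ($newsize !== $size) {
        // the file grew: send the new size and content, then stop waiting
        echo json_encode(array("size" => $newsize, "content" => file_get_contents($txt)));
        break;
    }
    clearstatcache();            // drop the cached stat info so filesize() hits the disk again
    $newsize = (int)filesize($txt);
    usleep(400000);              // wait 400 ms before checking again
}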
Okay, after many tests and a lot of research, I came to the conclusion that a PHP server can never contact a specific client directly unless the client sends a request to the server first.
The most reliable solution I found is to use an infinite loop that breaks on data change. It reduces the frequency of AJAX requests to the server considerably, hence increasing performance and decreasing memory and CPU usage on the client's device. Here is how it goes:
PHP 1 (handles data updates or new data inserts into the database):
$process = $_POST['process'];
$log = "logs/logs.txt";

if ($process == 'update') {
    // execute the MySQLi UPDATE command and update the table.
    $str = "Update on " . date('d/m/Y - H:i:s') . "\n"; // add text to the logs file (this increases logs.txt's size)
    file_put_contents($log, $str, FILE_APPEND);         // FILE_APPEND adds the string to the end of the file instead of replacing its content
} else if ($process == 'insert') {
    // execute the MySQLi INSERT command and add new data to the table.
    $str = "Added new data on " . date('d/m/Y - H:i:s') . "\n";
    file_put_contents($log, $str, FILE_APPEND);
}
The above code inserts/updates the data, creates the file logs.txt if it doesn't exist, and appends additional text on each request. logs.txt is used later in the infinite loop below, which breaks when the file's size changes.
PHP 2 (handles requests for reading the data):
if (isset($_POST['id']) && $_POST['id'] != '' && isset($_POST['size']) && $_POST['size'] != '') {
    $id = (string)$_POST['id'];
    $init_size = (int)$_POST['size'];
    $size = file_exists('logs/logs.txt') ? (int)filesize('logs/logs.txt') : 0; // $size is logs.txt's size, or 0 if logs.txt doesn't exist (not created yet)
    $select = $con->prepare("SELECT * FROM data WHERE id=?");
    $select->bind_param('s', $id);
    while (true) { // loops indefinitely until the break condition is met
        if ($init_size !== $size) {
            $select->execute();
            $result = $select->get_result();
            while ($row = $result->fetch_assoc()) {
                $data['rows'][] = array(
                    "column 1" => $row['column 1'],
                    "column 2" => $row['column 2'],
                );
            }
            $data['size'] = $size;
            echo json_encode($data);
            break; // break the loop when ($init_size !== $size), which indicates the database has been updated or new data has been added
        } else {
            clearstatcache(); // clears the cached filesize of logs.txt
            $size = file_exists('logs/logs.txt') ? (int)filesize('logs/logs.txt') : 0;
            usleep(100000); // sleep 100 ms
        }
    }
}
AJAX:
var size = 0; // global variable holding the last known logs.txt size, initially 0

function send(s) {
    var formData = new FormData(),
        id = document.getElementById('id').value;
    formData.append('id', id);
    formData.append('size', s);
    var xhr = (window.XMLHttpRequest) ? new XMLHttpRequest() : new ActiveXObject('Microsoft.XMLHTTP');
    xhr.open('POST', 'server.php', true);
    xhr.timeout = 25000; // set the XMLHttpRequest timeout to 25 sec; many servers have a short execution timeout (27 sec in my case), so keep it below that
    xhr.send(formData);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            var data = JSON.parse(xhr.responseText);
            size = data.size;
            console.log(data.rows);
            setTimeout(function() { send(size); }, 100); // re-initiate the request after receiving the data
        }
    };
    xhr.ontimeout = function() {
        xhr.abort(); // abort the timed-out request
        setTimeout(function() { send(size); }, 100); // and start a new one
    };
}
send(size);
This is not an ideal solution, but it reduced the XMLHttpRequests from 2 per second to as low as 1 per 25 seconds, and I hope somebody will be able to come up with a better solution.