I have a large, non-user-specific multidimensional array (~100MB-150MB) that I currently save in a JSON file. The array must be updated every minute with data from an API.
I'm not sure whether I should store it in $_COOKIE or $_SESSION instead, to avoid the fopen(); fwrite(); fclose(); file writes for performance reasons.
Memory is probably not an issue, since the data is already being collected with these settings at the top of the script:
ini_set('max_execution_time', 0);
ini_set('memory_limit', '-1');
set_time_limit(0);
Data collection:
// Config class for path and other constants
require_once __DIR__ . "/ConstEQ.php";

class EquityRecords extends EQ implements ConstEQ
{
    public static function getSymbols()
    {
        //***************** START: ALL SYMBOLS ARRAY ********************** //
        // var: path to an md file containing the list of equities
        $list_of_equities_file = __DIR__ . self::SYMBOLS_PATH;
        // var: raw contents of the md file
        $content_of_equities = file_get_contents($list_of_equities_file);
        // var: array of lines, each with three tab-separated fields such as:
        // string(4) "ZYNE", string(10) "2019-01-04", string(27) "ZYNERBA PHARMACEUTICALS INC"
        $symbols_array = preg_split('/\R/', $content_of_equities);
        //***************** END: ALL SYMBOLS ARRAY ********************** //

        // $child and $mother group the equities into batches of 100,
        // which seems to be the API limit per request.
        $child = array();
        $mother = array();
        $limit_counter = self::NUMBER_OF_STOCKS_PER_REQUEST;
        foreach ($symbols_array as $ticker_line) {
            $limit_counter = $limit_counter - 1;
            // Use a distinct variable here: reassigning $symbols_array
            // inside the loop shadowed the array being iterated.
            $ticker_fields = preg_split('/\t/', $ticker_line);
            array_push($child, $ticker_fields);
            if ($limit_counter <= 0) {
                $limit_counter = self::NUMBER_OF_STOCKS_PER_REQUEST;
                array_push($mother, $child);
                $child = array();
            }
        }
        // Keep the final, partially filled batch; the original version
        // silently dropped any leftover symbols.
        if (!empty($child)) {
            array_push($mother, $child);
        }
        return $mother;
    }
}
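The batching above can be expressed more compactly with array_chunk, which also keeps the final partial batch. A minimal standalone sketch (the sample ticker lines and the batch size of 100 are assumptions for illustration, not taken from the real md file):

```php
<?php
// Hypothetical sample: 250 tab-separated ticker lines, shaped like the md file.
$lines = array();
for ($i = 1; $i <= 250; $i++) {
    $lines[] = "SYM$i\t2019-01-04\tCOMPANY $i INC";
}

// Split each line into its fields, then group records into batches of 100
// (the apparent per-request API limit mentioned above).
$records = array_map(function ($line) {
    return preg_split('/\t/', $line);
}, $lines);
$mother = array_chunk($records, 100);

echo count($mother), "\n";    // prints 3 (two full batches plus a partial one)
echo count($mother[2]), "\n"; // prints 50 (the final partial batch is kept)
```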
By default, $_SESSION has a maximum capacity of 128MB. Is the maximum capacity of $_COOKIE equal to that of $_SESSION?
Which of the two would likely be faster or more appropriate to use, or is there some other way to avoid writing to a file for better performance?
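If file storage stays in play, one standard pattern for avoiding readers seeing a half-written update is an atomic write: write the full payload to a temporary file, then rename() it over the target. This is a sketch under assumed file names and sample data, not taken from the original code:

```php
<?php
// Sketch: atomically replace the JSON snapshot so a process that reads
// equities.json mid-update never sees a truncated file.
// The file names and $data contents are assumptions for illustration.
$data = array(
    'AAPL' => array('price' => 170.5),
    'ZYNE' => array('price' => 3.2),
);

$target = __DIR__ . '/equities.json';
$tmp    = $target . '.tmp';

// Write the complete payload to a temporary file first...
file_put_contents($tmp, json_encode($data));
// ...then rename(), which atomically replaces the target on the
// same POSIX filesystem.
rename($tmp, $target);

$readBack = json_decode(file_get_contents($target), true);
echo $readBack['AAPL']['price'], "\n"; // prints 170.5
```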