If you just want to print the IDs to a file, you can write a custom query for that. That way you avoid some of the internal processing that WordPress does.
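For example, a minimal sketch of such a custom query could look like the following (this assumes WordPress is already loaded so that $wpdb is available, and the output file name ids.txt is just a placeholder):

// Fetch only the IDs of published posts with a single direct query
global $wpdb;
$ids = $wpdb->get_col( "SELECT ID FROM {$wpdb->posts} WHERE post_type = 'post' AND post_status = 'publish'" );
// Write one ID per line to the file
file_put_contents( dirname( __FILE__ ) . '/ids.txt', implode( "\n", $ids ) . "\n" );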
A large number of posts can certainly exhaust your memory, although I don't think that merely selecting the IDs of 2100 posts should consume 134 MB. Do the math: an ID can be stored in as little as 1 byte, but let's assume it takes 4 bytes. Even then, 2100 × 4 = 8400 bytes = 8.4 KB. Of course PHP needs more memory internally to process the query, create objects and so on, but with 134 MB of memory I could easily handle a few hundred thousand IDs. So obviously you are doing something wrong somewhere else.
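If you want to see where the memory is actually going, a quick sanity check (just a sketch; memory_get_peak_usage() is a standard PHP function) is to log the peak memory usage before and after the query:

// Log peak memory usage (in MB) around the ID query to see what it really costs
error_log( sprintf( 'Peak memory before query: %.2f MB', memory_get_peak_usage() / 1048576 ) );
$ids = $wpdb->get_col( "SELECT ID FROM {$wpdb->posts} WHERE post_status = 'publish'" );
error_log( sprintf( 'Peak memory after query: %.2f MB', memory_get_peak_usage() / 1048576 ) );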
Anyway, whatever the reason (perhaps you need more data from each product than just the ID), you can also segment the query using limits, as in the following code:
if ( ! defined( 'ABSPATH' ) ) {
    /** Set up WordPress environment */
    require_once( dirname( __FILE__ ) . '/wp-load.php' );
}
// $limit determines how many rows you want to handle at any given time.
// Increase / decrease this limit to see how much your server can handle at a time.
$limit = 100;
$start = 0;
// open file handle
$myfile = fopen( dirname( __FILE__ ) . '/wp_all_import.txt', 'a' );
$qry = "SELECT ID FROM `$wpdb->posts` WHERE post_type = 'post' AND post_status = 'publish' LIMIT %d, %d";
while ( $result = $wpdb->get_results( $wpdb->prepare( $qry, array( $start, $limit ) ) ) ) {
    $write_data = '';
    foreach ( $result as $row ) {
        $write_data = $write_data . $row->ID . "\n";
    }
    // Generally speaking, writing immediately to the file is better than
    // string concatenation, because depending on what you concatenate and how many times,
    // the generated string may become too big (like an MB-sized string).
    // On the other hand, writing to a file thousands of times in a single script may
    // cause I/O delays. So here I've done a combination of the two, to keep both
    // the string size and the I/O within limits.
    // Adjust this according to your own situation.
    fwrite( $myfile, $write_data );
    $start = $start + $limit;
}
// close file handle
fclose( $myfile );
This way PHP will only ever handle at most $limit rows at a time, so the memory limit should not be exceeded.
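Assuming you save the script as, say, export-ids.php (the name is just an example) in the WordPress root directory, so that the require of wp-load.php above can find the file, you can run it from the command line and the IDs will be appended to wp_all_import.txt next to the script:

php export-ids.php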
Note: never concatenate into a very long string (like one that is MB long); write to the file before the string grows that large. It may introduce some I/O delay, but it won't exhaust the memory limit.
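For instance, if you would rather avoid string concatenation entirely, a sketch of that alternative is to write each ID out as soon as it is read, at the cost of more fwrite() calls:

foreach ( $result as $row ) {
    // Write each ID immediately instead of accumulating a string in memory
    fwrite( $myfile, $row->ID . "\n" );
}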