> The content file is read completely into memory on every page call. This is historically established, and quite some plugins rely on the complete content being accessible through global variables. One might argue that this is a bad design, but it's nothing that could be changed without breaking those plugins.

It *is* bad design. It's simply not a good idea to read the whole site content into memory on each request. In my opinion the core could be changed quite easily to read one line at a time, or to start from a specific line via an index. With small sites the overhead is hardly noticeable, but it's still not good practice to do it the way it's done currently. Many hosts limit resources per user, and any site could choke once it becomes more popular.
Just rolling my eyes. I didn't expect it to be like that. I'd just rewrite it... and break the plugins. How many plugins would be affected? All of them?
For a start, your solution seems like a good improvement and doesn't break things... but it still needs to be rewritten. Even if the content were under a megabyte, it would still be a waste of a limited resource...
I think this needs to be addressed in the next major release.
Edit: WordPress seems bloated and heavy to me, while cmsimplexh seems easier to adopt. Joomla is quite horrible too. Well... I'm not yet familiar with the code of cmsimplexh, but I'm skimming through it to see how it could be changed...
Edit 2:
I have checked how it works, and if I'm not completely wrong, the function at adm.php:728 shows how the content is handled on each request:

file: adm.php, line: 728
function read_content_file($path)
My suggestions:
- Generate "content map" with start- and end- bytes information, could be saved within content/pagedata.php???
- Content map would be regenerated each time the content is edited
- On each page load, only the necessary page content would be loaded.
offset = the byte to start reading at
maxlen = the maximum number of bytes to read (note: a length, not an end position)

file_get_contents($path . '/content/content.htm', false, null, $offset, $maxlen);
This would eliminate the need to read all the content into an array and search for every page break on every single page load. Huge difference!
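To make the idea concrete, here is a rough sketch. The function names are hypothetical, and I'm assuming pages start at `<h1` headings purely for illustration (the real CMSimple_XH split rules are more involved). The map would be built once, when the content is saved; page loads then fetch only one page's bytes:

```php
<?php
// Sketch only: build a byte-offset map for a content file whose pages
// start at "<h1" headings, then load a single page without reading the
// whole file on every request. Function names are hypothetical.

function build_content_map($file)
{
    // Reading the whole file is fine here: this runs on save, not on
    // every page view.
    $contents = file_get_contents($file);
    $starts = [];
    $offset = 0;
    while (($pos = strpos($contents, '<h1', $offset)) !== false) {
        $starts[] = $pos;
        $offset = $pos + 1;
    }
    $map = [];
    $size = strlen($contents);
    foreach ($starts as $i => $start) {
        $end = isset($starts[$i + 1]) ? $starts[$i + 1] : $size;
        $map[] = ['offset' => $start, 'length' => $end - $start];
    }
    return $map; // could be serialized into content/pagedata.php
}

function read_single_page($file, array $map, $pageIndex)
{
    $entry = $map[$pageIndex];
    // The 5th parameter of file_get_contents() is a *length*, not an
    // end position.
    return file_get_contents($file, false, null,
        $entry['offset'], $entry['length']);
}
```

So a request for page N costs one small read instead of parsing the whole content.htm.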
- content.htm should also only be read in chunks, to make bigger file sizes processable: read it in chunks of, say, 500 KB, one chunk at a time, when generating "the map" or when searching. No separate files needed.
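The chunked scan above could look roughly like this (again a hypothetical helper, with `<h1` standing in for the real page-break marker). The only subtlety is keeping a small overlap between chunks so a marker that straddles a chunk boundary isn't missed:

```php
<?php
// Sketch: scan a content file for page-break markers in fixed-size
// chunks, so the map can be (re)built without ever holding the whole
// file in memory. Hypothetical helper, not the real CMSimple_XH code.

function scan_offsets_chunked($file, $marker = '<h1', $chunkSize = 512000)
{
    $offsets = [];
    $handle = fopen($file, 'rb');
    $carry = '';                      // tail of the previous chunk
    $base = 0;                        // absolute file offset of $carry
    $overlap = strlen($marker) - 1;   // enough to catch a split marker
    while (!feof($handle)) {
        $chunk = $carry . fread($handle, $chunkSize);
        $pos = 0;
        while (($hit = strpos($chunk, $marker, $pos)) !== false) {
            $offsets[] = $base + $hit;
            $pos = $hit + 1;
        }
        // Keep the last ($overlap) bytes: a marker starting there can
        // only be completed by the next chunk.
        $carry = substr($chunk, -$overlap);
        $base += strlen($chunk) - strlen($carry);
    }
    fclose($handle);
    return $offsets;
}
```

Memory use then stays at roughly one chunk (plus the overlap) regardless of how large content.htm grows.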
But how much that would actually affect the plugins is a mystery to me.
Any ideas?