
PHP and XML: The cost of parsing a large XML file every page request

What is the cost of parsing a large XML file using PHP on every page request?

I would like to implement custom tags in HTML.

<?xml version="1.0"?>
<html>
    <head>
        <title>The Title</title>
    </head>
    <body>
        <textbox name="txtUsername" />
    </body>
</html>

After I load this XML file in PHP, I search for the custom tags using XPath and manipulate or replace them.
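For reference, a minimal sketch of what that lookup-and-replace step could look like, assuming DOMDocument/DOMXPath and a hypothetical page.xml template (the markup generated for <textbox> is also an assumption):

<?php
// Load the template and find the custom tags with XPath.
$doc = new DOMDocument();
$doc->load('page.xml');                              // hypothetical template file

$xpath = new DOMXPath($doc);
foreach ($xpath->query('//textbox') as $node) {
    // Replace each custom tag with a plain HTML <input> element.
    $input = $doc->createElement('input');
    $input->setAttribute('type', 'text');
    $input->setAttribute('name', $node->getAttribute('name'));
    $node->parentNode->replaceChild($input, $node);
}

echo $doc->saveHTML();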

Is this very costly or is it acceptable? What about applying this to a large-scale website?

In the past I also used XSLT for large sites, and it didn't seem to slow things down. This is somewhat similar to XSLT, but done manually.

I would guess it's pretty costly, but the best way to find out is to test it yourself and measure the peak memory usage and the time required to run the script.
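A rough way to take that measurement, assuming the DOM-based approach from the question (file and tag names are placeholders):

<?php
// Time the parsing/lookup step and report peak memory at the end.
$start = microtime(true);

$doc = new DOMDocument();
$doc->load('page.xml');
$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//textbox');

printf("time: %.4f s\n", microtime(true) - $start);
printf("peak memory: %d bytes\n", memory_get_peak_usage(true));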

You might be able to cache some intermediate state so that the heavy XML parsing doesn't have to be done every time. For example, you could replace the tags with actual PHP code, like Smarty does, and then include that generated/cached PHP file instead.

The cached file could look like the code in Soulmerge's answer.
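As a sketch of how that caching could be wired up; compile_template() is a hypothetical function that would do the XPath work and write out PHP like that cached file:

<?php
// Compile the XML template to plain PHP once, then include the cached
// file on every subsequent request.
$template = 'page.xml';
$cached   = 'cache/page.php';

if (!file_exists($cached) || filemtime($cached) < filemtime($template)) {
    file_put_contents($cached, compile_template($template));
}

include $cached;   // cheap on every request after the first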

Is this very costly or is it acceptable?

Don't guess. Measure.

Parsing the XML should be fast, as long as you use built-in classes like DOMXPath and your XML files are not too large.

However, I would rather replace the custom tags with function calls and include the file in PHP, which should be a lot faster, since you're then not doing any string manipulation in PHP:

<?xml version="1.0"?>
<html>
    <head>
        <title>The Title</title>
    </head>
    <body>
        <?php textbox('txtUsername') ?>
    </body>
</html>
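For completeness, one possible definition of the textbox() helper called above; the exact markup it emits is an assumption, not part of the answer:

<?php
// Hypothetical helper: print an <input> element for the given field name.
function textbox($name)
{
    printf('<input type="text" name="%s" />', htmlspecialchars($name, ENT_QUOTES));
}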
