
What is the best way to create scalable dynamic XML for external sites to read?

Currently we generate huge XML files for several hundred users; each file contains thousands of lines and requires a lot of RAM to create and read.

We wanted to provide dynamic XML files for other sites to read, but that would effectively kill our servers: the sites would constantly request the files, and our server couldn't deliver that much data in such a short amount of time. Seeing this, we opted to generate static XML files and store them in an S3 bucket for the sites to read whenever they want, so our servers wouldn't have to absorb the bulk of the requests. This works, but we're now hitting the limits of that solution: we're storing thousands of XML files with exactly the same fields, except for a few values that change based on the user data.

Is there a way to create a skeleton XML that supports dynamic values, doesn't force us to store the XML files unnecessarily, and doesn't put a huge burden on our servers? What would be the best/most scalable way to approach this problem? We're open to trying new architectures or frameworks.

How about using XML entities?

You can define your XML to use entity references for the variable parts:

<user>&username;</user>

and define the values of the entities in an external DTD. The XML parser expands the entity references while parsing. Most XML parsers also give the application a way to redirect the DTD filename that's hard-coded in the XML to one supplied dynamically at parse time.
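
For illustration, here's a minimal sketch of how that could look (the file name, entity names, and field values are made up). The skeleton XML, identical for every user, references an external DTD:

<!DOCTYPE user SYSTEM "user.dtd">
<user>
  <name>&username;</name>
  <plan>&plan;</plan>
</user>

and the DTD supplied for a given user defines the entity values:

<!ENTITY username "alice">
<!ENTITY plan "premium">

When the document is parsed with the external DTD loaded, &username; and &plan; expand to the per-user values, so the skeleton itself never has to change.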

DTDs and entities are a bit out of fashion, to the extent that some XML parsers don't support them, but they meet this requirement pretty well.
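
As one possible illustration of the redirection mentioned above (assuming lxml in Python; the user.dtd name, entity names, and PerUserDTDResolver class are hypothetical and match the sketch above), you can register a custom Resolver so the DTD reference is answered with entity definitions built on the fly from user data:

from lxml import etree

# Skeleton shared by every user; only the DTD contents differ per user.
SKELETON = b'''<!DOCTYPE user SYSTEM "user.dtd">
<user>
  <name>&username;</name>
  <plan>&plan;</plan>
</user>'''

class PerUserDTDResolver(etree.Resolver):
    # Answers the "user.dtd" reference with entity definitions
    # generated from the current user's data instead of a file on disk.
    def __init__(self, user):
        self.user = user

    def resolve(self, system_url, public_id, context):
        if system_url and system_url.endswith("user.dtd"):
            dtd = ('<!ENTITY username "%s">\n'
                   '<!ENTITY plan "%s">' % (self.user["name"], self.user["plan"]))
            return self.resolve_string(dtd, context)
        return None  # let lxml resolve anything else normally

parser = etree.XMLParser(load_dtd=True, resolve_entities=True)
parser.resolvers.add(PerUserDTDResolver({"name": "alice", "plan": "premium"}))

doc = etree.fromstring(SKELETON, parser)
print(etree.tostring(doc).decode())  # &username; and &plan; are expanded to alice / premium

The same idea applies on the consuming side: whoever reads the skeleton decides which DTD (and therefore which values) to plug in, without the skeleton ever being duplicated or stored per user.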
