
Parse xml and write a csv with header columns

I am trying to parse an xml file containing meteo data and to write some values into a csv file.

from qgis.PyQt.QtCore import *
import requests
import xml.etree.ElementTree as ET

# url of the xml to parse
baseUrl = 'http://www.arpa.veneto.it/bollettini/meteo/h24/img08/0144.xml'
resp = requests.get(baseUrl)
msg = resp.content
tree = ET.fromstring(msg)

# loop over every station, its sensors and each sensor's readings
for stazione in tree.iter('STAZIONE'):
    idstaz = stazione.find('IDSTAZ').text
    for sensore in stazione.iter('SENSORE'):
        id = sensore.find('ID').text
        for dati in sensore.iter('DATI'):
            ist = dati.get('ISTANTE')
            vm = dati.find('VM').text
            # append one line per reading (Python 2 print-to-file syntax)
            f = open('D:/GIS/_Temp/result.csv', 'a')
            print >> f, idstaz, id, ist, vm
            f.close()

I'm not sure that this code is elegant, but it works:

144 300000864 201701080100 -4.2
144 300000864 201701080200 -4.5
144 300000864 201701080300 -4.8
144 300000864 201701080400 -5.5
...

But I don't know how to add the headers to the columns.

Open the file once before the for loop and write the header line to it first:

from qgis.PyQt.QtCore import *
import requests
import xml.etree.ElementTree as ET

# url of the xml to parse
baseUrl = 'http://www.arpa.veneto.it/bollettini/meteo/h24/img08/0144.xml'
resp = requests.get(baseUrl)
msg = resp.content
tree = ET.fromstring(msg)

# open the file once, in 'w' mode, so the header ends up at the top of a fresh file
f = open('D:/GIS/_Temp/result.csv', 'w')
# header matching the four values written below (print >> f separates them with spaces)
f.write('IDSTAZ ID ISTANTE VM\n')

for stazione in tree.iter('STAZIONE'):
    idstaz = stazione.find('IDSTAZ').text
    for sensore in stazione.iter('SENSORE'):
        id = sensore.find('ID').text
        for dati in sensore.iter('DATI'):
            ist = dati.get('ISTANTE')
            vm = dati.find('VM').text
            print >> f, idstaz, id, ist, vm

f.close()
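
If you are running this under Python 3 (e.g. QGIS 3.x), the same idea can be written with the standard csv module, so that the header and the data rows are true comma-separated fields. This is only a minimal sketch under that assumption; it reuses the same URL, output path and column names as above:

import csv
import requests
import xml.etree.ElementTree as ET

baseUrl = 'http://www.arpa.veneto.it/bollettini/meteo/h24/img08/0144.xml'
tree = ET.fromstring(requests.get(baseUrl).content)

# newline='' is recommended by the csv docs to avoid blank lines on Windows
with open('D:/GIS/_Temp/result.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    # write the header row once, before the loops
    writer.writerow(['IDSTAZ', 'ID', 'ISTANTE', 'VM'])
    for stazione in tree.iter('STAZIONE'):
        idstaz = stazione.find('IDSTAZ').text
        for sensore in stazione.iter('SENSORE'):
            sens_id = sensore.find('ID').text
            for dati in sensore.iter('DATI'):
                writer.writerow([idstaz, sens_id, dati.get('ISTANTE'), dati.find('VM').text])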
