
I want to extract coordinates from a KML file

I am trying to extract coordinates from a KML file in Python, but it gives me an error. Below are my code and the KML file.

Code:

from pykml import parser

root = parser.fromstring(open('task_2_sensor.kml', 'r').read())
print (root.Document.Placemark.Point.coordinates)

Error:

ValueError: Unicode strings with encoding declaration are not supported. Please use bytes input or XML fragments without declaration.

KML File:

                        <coordinates>
                            13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.323018,52.499687,0 13.310096,52.4893,0 13.310096,52.4893,0 13.309909,52.48929,0 13.309909,52.48929,0 13.309753,52.489235,0                            
                        </coordinates>
                    </LineString>
                </Placemark>
                                        
        </Folder>
                
        <LookAt>
            
        </LookAt>
    </Document>
</kml>

If the source KML has an explicit encoding declaration, you must either remove the XML declaration line from the KML or use parse() instead of fromstring().
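The error message itself also points to a third workaround: fromstring() rejects a decoded str that still carries an encoding declaration, but it accepts the same content as bytes. A minimal sketch of that variant, assuming the same file name as in the question:

from pykml import parser

# Read in binary mode so fromstring() receives bytes; lxml accepts bytes
# input even when the XML prolog declares an encoding.
with open('task_2_sensor.kml', 'rb') as f:
    root = parser.fromstring(f.read())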

Otherwise, use this form to parse the KML file with pykml:

from pykml import parser

# parse() handles files that carry an XML encoding declaration.
with open('task_2_sensor.kml', 'r') as f:
    root = parser.parse(f).getroot()
print(root.Document.Placemark.Point.coordinates)
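Once the tree is parsed, the coordinates element is just text: a whitespace-separated list of "lon,lat,alt" triples, as visible in the posted file. A minimal sketch of turning that text into numeric tuples, assuming the Placemark sits inside a Folder and holds a LineString as in the fragment above (the attribute path is an assumption; adjust it to match the real file):

# The Placemark in the posted fragment is inside a Folder and uses a
# LineString, so this path is an assumption; change it if needed.
coords_text = root.Document.Folder.Placemark.LineString.coordinates.text

# Each whitespace-separated entry is "longitude,latitude,altitude".
points = [tuple(float(v) for v in triple.split(',')) for triple in coords_text.split()]
print(points[:3])  # e.g. [(13.323018, 52.499687, 0.0), ...]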
