loop through dirnames list.remove() is not working
I am trying to loop through multiple directories and multiple GDBs to build a list of feature classes. The problem I am having is that when I try to remove certain feature classes from the list, the script either just ignores it or I get an error saying x is not in list for list.remove(x). The thing about the feature class names is that each GDB has a few letters of its own, while the rest of the name is standard across all GDBs.
Like this:
directory1 > directory1.gdb > shapes > fc_dir1_feature
and
directory2 > directory2.gdb > shapes > fc_dir2_feature
and so on...
I am using
for dirpath, dirnames, filenames in arcpy.da.Walk(in_workspace, datatype="FeatureClass", type="Polygon"):
    if "dir1" in dirnames:
        dirnames.remove('dir1')
which works fine for removing a feature dataset from a GDB and, by extension, all the feature classes inside it. But I can't remove just a specific feature class.
Thanks for your help.
Assuming arcpy.da.Walk works like os.walk (that is, removing a directory from dirnames stops the walk from descending into it), you should add another for loop to iterate over dirnames and apply your filter. Note that I iterate over a copy of dirnames so I can call remove without messing up the iteration.
for dirpath, dirnames, filenames in arcpy.da.Walk(in_workspace, datatype="FeatureClass", type="Polygon"):
    # remove subdirectories that match the pattern so they will not be walked
    for dirname in dirnames[:]:
        if 'dir1' in dirname:
            dirnames.remove(dirname)
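The copy (`dirnames[:]`) matters: removing items from a list while iterating over that same list skips elements, which is likely why the original script seemed to ignore some removals. A minimal illustration with plain Python lists (no arcpy needed, names made up):

```python
# Removing from the list you are iterating over skips elements:
# after 'dir1_a' is removed, the iterator steps past 'dir1_b'.
dirnames = ['dir1_a', 'dir1_b', 'keep']
for d in dirnames:
    if 'dir1' in d:
        dirnames.remove(d)
print(dirnames)  # ['dir1_b', 'keep'] -- 'dir1_b' survived by accident

# Iterating over a copy removes every match reliably.
dirnames = ['dir1_a', 'dir1_b', 'keep']
for d in dirnames[:]:
    if 'dir1' in d:
        dirnames.remove(d)
print(dirnames)  # ['keep']
```

The same pitfall applies to filenames: filter the list by iterating over a copy, or build a new filtered list instead of mutating in place.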
Cribbing from ArcGIS Resources, you can filter filenames in a few different ways using regular expressions. Here is an example of a regex that drops files with 'abc', 'def', or 'ghi' in the wildcard slot:
import arcpy
import os
import re

workspace = "c:/data"
feature_classes = []

# I don't like abc, def or ghi features, so I have a regex to match them
filter_classes_re = re.compile('fc_(abc|def|ghi)_feature$')

for dirpath, dirnames, filenames in arcpy.da.Walk(workspace,
                                                  datatype="FeatureClass",
                                                  type="Polygon"):
    for filename in filenames:
        # only add to the feature list if it doesn't match the bad guys
        if filter_classes_re.match(filename) is None:
            feature_classes.append(os.path.join(dirpath, filename))
# Alternately, I could extract the wildcard part and compare it outside
# of the regex ... but this will be slower
filter_classes_re = re.compile('fc_(.*?)_feature$')

for dirpath, dirnames, filenames in arcpy.da.Walk(workspace,
                                                  datatype="FeatureClass",
                                                  type="Polygon"):
    for filename in filenames:
        # extract the wildcard part
        match = filter_classes_re.match(filename)
        if match:
            matched = match.group(1)
        else:
            matched = ''
        if matched not in ('abc', 'def', 'ghi'):
            feature_classes.append(os.path.join(dirpath, filename))
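The filtering logic itself can be checked without arcpy by running both regex approaches against a hypothetical list of filenames (the names below are made up to stand in for what arcpy.da.Walk would yield):

```python
import re

# Hypothetical filenames standing in for arcpy.da.Walk output.
filenames = ['fc_abc_feature', 'fc_xyz_feature', 'fc_def_feature', 'other']

# Approach 1: match the unwanted tokens directly in the regex.
filter_classes_re = re.compile('fc_(abc|def|ghi)_feature$')
kept = [f for f in filenames if filter_classes_re.match(f) is None]
print(kept)  # ['fc_xyz_feature', 'other']

# Approach 2: capture the wildcard part and compare it in Python.
extract_re = re.compile('fc_(.*?)_feature$')
kept2 = []
for f in filenames:
    m = extract_re.match(f)
    token = m.group(1) if m else ''
    if token not in ('abc', 'def', 'ghi'):
        kept2.append(f)
print(kept2)  # ['fc_xyz_feature', 'other']
```

Both approaches keep the same names; note that 'other' survives approach 2 because a non-matching filename yields the empty token, which is not in the excluded set.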