Removing Columns From A Pipe Delimited File C#
I have a pipe delimited file with the following columns:
Col0|Col1|Col2|Col3|Col4|Col5|Col6
data|data|data|data|data|data|data
data|data|data|data|data|data|data
data|data|data|data|data|data|data
data|data|data|data|data|data|data
I want to remove columns from this pipe delimited file.
// This code removes Col2:
string[] csvLines = File.ReadAllLines(@"Input.txt");
string header = csvLines.FirstOrDefault(l => !String.IsNullOrWhiteSpace(l));
if (header != null)
{
    IEnumerable<string> allButWantedCols = csvLines
        .Select(l => new { Columns = l.Split(new[] { '|' }, StringSplitOptions.None) })
        .Where(x => x.Columns.Length > 2)
        .Select(x => string.Join("|", x.Columns
            .Where((col, index) => index != 2)
            .Select(col => col.Trim())));
    // rewrite the file with all columns except the removed one:
    File.WriteAllLines(@"pipe_OUTPUT_.txt", allButWantedCols);
}
The code above removes "Col2" and all of its data.
I am struggling to figure out how to remove multiple columns, e.g. remove columns Col2, Col3, and Col6.
This is your current code:
allButWantedCols = csvLines
    .Select(l => new { Columns = l.Split(new[] { '|' }, StringSplitOptions.None) })
    .Where(x => x.Columns.Length > 2)
    .Select(x => string.Join("|", x.Columns
        .Where((col, index) => index != 2)
        .Select(col => col.Trim())));
Change it to something along the lines of the following:
allButWantedCols = csvLines
    .Select(l => new { Columns = l.Split(new[] { '|' }, StringSplitOptions.None) })
    .Where(x => x.Columns.Length > 2)
    .Select(x => string.Join("|", x.Columns
        .Where((col, index) => index != 2 && index != 3 && index != 6)
        .Select(col => col.Trim())));
You will potentially want to ensure that columns exist at those indexes first, but this should give you what you asked for in your initial question.
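If you would rather not hard-code each index in the `Where` clause, the unwanted indexes can be collected into a `HashSet<int>` so the set of columns to drop is data-driven. A minimal sketch of that approach; `PipeFile` and `RemoveColumns` are names introduced here for illustration, not part of the original code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class PipeFile
{
    // Drops the columns at the given zero-based indexes from each
    // pipe-delimited line. Blank lines are skipped.
    public static IEnumerable<string> RemoveColumns(IEnumerable<string> lines, params int[] indexes)
    {
        var toRemove = new HashSet<int>(indexes);
        return lines
            .Where(l => !string.IsNullOrWhiteSpace(l))
            .Select(l => string.Join("|", l.Split('|')
                .Where((col, index) => !toRemove.Contains(index))
                .Select(col => col.Trim())));
    }
}

class Demo
{
    static void Main()
    {
        // In-memory sample; with files you would pass the result of
        // File.ReadAllLines(@"Input.txt") and write it back with
        // File.WriteAllLines, as in the question.
        var lines = new[] { "Col0|Col1|Col2|Col3|Col4|Col5|Col6" };
        foreach (var line in PipeFile.RemoveColumns(lines, 2, 3, 6))
            Console.WriteLine(line); // Col0|Col1|Col4|Col5
    }
}
```

Because the indexes arrive as a `params` array, the same call site can remove one column or many without changing the LINQ query itself.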