excel - Create a many-to-one sheet from a one-to-many sheet
I have a sheet in Excel that looks something like:
| Value1 | Data1 | Data1b | 1,3,4,8 |
| Value2 | Data2 | Data2b | 2 |
| Value3 | Data3 | Data3b | 6,7,8 |
I want to take that sheet and produce a second sheet that splits the last column into separate rows, keeping the rest of the data in sync. So when the first sheet is updated, the second sheet updates too, and if numbers are added to the last column of the first sheet, new rows are added to the second sheet.
The second sheet should look like this:
| Value1 | Data1 | Data1b | 1 |
| Value2 | Data2 | Data2b | 2 |
| Value1 | Data1 | Data1b | 3 |
| Value1 | Data1 | Data1b | 4 |
| Value3 | Data3 | Data3b | 6 |
| Value3 | Data3 | Data3b | 7 |
| Value1 | Data1 | Data1b | 8 |
| Value3 | Data3 | Data3b | 8 |
Update: Below is the code I'm trying to use. First, is this the best approach overall? Is clearing and then repopulating the right way to update the second sheet? Finally, how do I make it run automatically when someone updates the first sheet?
Update: The only thing still not working is the sort at the end. Does anyone know why?
Private FROM_SHEET As String
Private TO_SHEET As String
Private START_ROW As Long
Private NUM_COL As Long

Sub oneToMany()
    FROM_SHEET = "Sheet1"
    TO_SHEET = "Sheet2"
    START_ROW = 2
    NUM_COL = 4

    Dim fromSheet As Worksheet
    Dim toSheet As Worksheet
    Dim newRow As Long
    Dim i As Long
    Dim col As String
    Dim nums() As String
    Dim num As Variant
    Dim lastRow As Long

    Set fromSheet = Sheets(FROM_SHEET)
    Set toSheet = Sheets(TO_SHEET)

    toSheet.UsedRange.ClearContents
    newRow = START_ROW

    For i = START_ROW To fromSheet.Cells(fromSheet.Rows.Count, 1).End(xlUp).Row
        col = fromSheet.Cells(i, NUM_COL)
        nums = Split(col, ",")
        For Each num In nums
            'Copy the whole source row, then overwrite the list cell with one number
            fromSheet.Rows(i).Copy toSheet.Rows(newRow)
            toSheet.Cells(newRow, NUM_COL) = Trim(num)
            newRow = newRow + 1
        Next num
    Next

    'The sort fails if lastRow, lastCol and START_COL are never assigned:
    'undeclared Variants default to Empty (0), which makes the Sort range invalid.
    lastRow = newRow - 1
    toSheet.Range(toSheet.Cells(START_ROW, 1), toSheet.Cells(lastRow, NUM_COL)).Sort _
        key1:=toSheet.Cells(START_ROW, NUM_COL), _
        order1:=xlAscending, Header:=xlNo
End Sub
In answer to your first question, if that is the only data on the sheet, your ClearContents approach is fine.
To run it automatically, look at the Worksheet_Change event on Sheet1. You can target the change based on which cell was modified.
Private Sub Worksheet_Change(ByVal Target As Range)
    'Re-run the split whenever a cell in the watched area changes
    If Target.Row < 4 And Target.Column < 5 Then
        Call oneToMany
    End If
End Sub
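A slightly more robust variant of the same idea (a sketch, assuming the source data lives in Sheet1!A2:D100 — adjust that range to your data) uses Intersect so the guard follows an explicit watched range instead of hard-coded row/column limits, and suspends events while oneToMany rewrites Sheet2 so the rewrite cannot re-trigger the handler:

```vba
Private Sub Worksheet_Change(ByVal Target As Range)
    'A2:D100 is an assumption; widen it to cover your actual data
    If Not Intersect(Target, Me.Range("A2:D100")) Is Nothing Then
        Application.EnableEvents = False   'avoid re-triggering while we write
        Call oneToMany
        Application.EnableEvents = True
    End If
End Sub
```

Disabling events is optional here because oneToMany only writes to Sheet2 while the handler watches Sheet1, but it is cheap insurance if the macro ever touches the watched sheet.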