I am constantly getting this error "ValueError: not enough values to unpack (expected 2, got 1)" on the line for (word, tag) in grp:
This is what I have tried:

# -*- coding: utf-8 -*-
import nltk
import itertools
import ast
import collections
import sys
import re
import time
f=open('test.txt','r')
text1=f.read()
text2=text1.rstrip()
text3=text2.strip()
#text3=tuple(text1)
#print(text3)
print("text3")
train_data=text3
print(train_data)
f=open('test1.txt','r')
text5=f.read()
#text6=text5.splitlines()
text6=text5.strip()
text7=text6.rstrip()
orig_data=text7
So, train_data is just a str. Notice how you read it from a file and don't do anything to change it into code. If you want to confirm, call print(type(train_data)). You'll get <class 'str'>.
You can iterate through a str with a for loop; that's why your first loop works. But in your second loop, you're just looping over the individual characters of the original str.
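For illustration, here is a minimal demonstration of the difference (the sample string is made up, standing in for whatever f.read() returns):

```python
# Iterating over a str yields single characters, not (word, tag) tuples.
raw = "[[('the', 'DT'), ('dog', 'NN')]]"  # f.read() gives you a str like this

for ch in raw[:3]:
    print(ch)  # prints '[', then '[', then '('

# Unpacking a one-character str into two names reproduces the error:
try:
    (word, tag) = raw[0]
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)
```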
If you want to use it as actual data, you must parse it and turn it into a Python data structure. DO NOT USE EVAL FOR THIS. Instead, use the ast library (safer and more stable in case of mistakes in your data):
import ast
# … later …
train_data = ast.literal_eval(text3)
Then go on and use train_data as you're using it.
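Putting it together, a sketch of the fix, assuming test.txt contains a Python literal shaped like [[('word', 'TAG'), ...]] (the exact file contents are an assumption; the string below stands in for f.read()):

```python
import ast

# Stand-in for the text read from test.txt; the real file is assumed
# to hold a Python literal in this shape.
text = "[[('the', 'DT'), ('dog', 'NN')], [('runs', 'VBZ')]]"

# ast.literal_eval only accepts literals, so it is safe, unlike eval().
train_data = ast.literal_eval(text.strip())
print(type(train_data))  # <class 'list'>

for grp in train_data:
    for (word, tag) in grp:  # unpacking now works: each item is a 2-tuple
        print(word, tag)
```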
f=open('C://Users//DELL//Desktop//test.txt','r')
text1=f.read()
text2=text1.rstrip()
text3=text2.strip()
#text3=tuple(text1)
#print(text3)
print("text3")
train_data=text3
print(train_data)
f=open('C://Users//DELL//Desktop//test1.txt','r')
text5=f.read()
#text6=text5.splitlines()
text6=text5.strip()
text7=text6.strip()
orig_data=text7
print(orig_data)
orgword=[]
orgtags=[]
#orig_data=train_data
#print("original data")
#print(orig_data)
for grp in train_data:
    for word in grp:
        orgword.append(word)
        #orgtags.append(tag)
print("Original Words")
print(orgword)
print("Original Tags")
print(orgtags)
#fix it...!!! (you get word in test)
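A hedged sketch of the corrected collection step, assuming both files hold Python literals of the form [[('word', 'TAG'), ...]] (the helper names below are illustrative, not from the original code):

```python
import ast

def load_tagged(path):
    # Read the file and parse its text into real Python data.
    with open(path, 'r') as f:
        return ast.literal_eval(f.read().strip())

def split_words_tags(data):
    # Separate the (word, tag) pairs into two parallel lists.
    words, tags = [], []
    for grp in data:
        for (word, tag) in grp:
            words.append(word)
            tags.append(tag)
    return words, tags

# Usage with the paths from the question:
# orig_data = load_tagged('C://Users//DELL//Desktop//test1.txt')
# orgword, orgtags = split_words_tags(orig_data)
```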