
Launch Hadoop MapReduce job via Python without PuTTy/SSH

I run Hadoop MapReduce jobs by logging in over SSH with PuTTY, which requires me to enter the hostname/IP address, login name, and password in PuTTY to get an SSH command-line window. Once in the SSH console, I then issue the appropriate MR command, such as:

hadoop jar /usr/lib/hadoop-0.20-mapreduce/contrib/streaming/hadoop-streaming-2.0.0-mr1-cdh4.0.1.jar -file /nfs_home/appers/user1/mapper.py -file /nfs_home/appers/user1/reducer.py -mapper '/usr/lib/python_2.7.3/bin/python mapper.py' -reducer '/usr/lib/python_2.7.3/bin/python reducer.py' -input /ccexp/data/test_xml/0901282-510179094535002-oozie-oozi-W/extractOut/*/*.xml -output /user/ccexptest/output/user1/MRoutput

What I would like to do is replace this clunky process with Python, so that I can launch the MapReduce job from a Python script without having to log in over SSH through PuTTY.

Can this be done, and if so, can someone show me how?

I solved this with the following script:

import paramiko

# Define connection info
host_ip = 'xx.xx.xx.xx'
user = 'xxxxxxxx'
pw = 'xxxxxxxx'

# Paths
input_loc = '/nfs_home/appers/extracts/*/*.xml'
output_loc = '/user/lcmsprod/output/cnielsen/'
python_path = "/usr/lib/python_2.7.3/bin/python"
hdfs_home = '/nfs_home/appers/cnielsen/'
output_log = r'C:\Users\cnielsen\Desktop\MR_Test\MRtest011316_0.txt'

# File names
xml_lookup_file = 'product_lookups.xml'
mapper = 'Mapper.py'
reducer = 'Reducer.py'
helper_script = 'Process.py'
product_name = 'test1'
output_ref = 'test65'

# ----------------------------------------------------

def buildMRcommand(product_name):
    space = " "
    mr_command_list = [ 'hadoop', 'jar', '/share/hadoop/tools/lib/hadoop-streaming.jar',
                        '-files', hdfs_home+xml_lookup_file,
                        '-file', hdfs_home+mapper,
                        '-file', hdfs_home+reducer,
                        '-mapper', "'"+python_path, mapper, product_name+"'",
                        '-file', hdfs_home+helper_script,
                        '-reducer', "'"+python_path, reducer+"'",
                        '-input', input_loc,
                        '-output', output_loc+output_ref]

    MR_command = space.join(mr_command_list)
    print MR_command
    return MR_command

# ----------------------------------------------------

def unbuffered_lines(f):
    # Stream lines from the channel file byte by byte so output
    # appears in real time instead of after the job finishes.
    line_buf = ""
    while not f.channel.exit_status_ready():
        line_buf += f.read(1)
        if line_buf.endswith('\n'):
            yield line_buf
            line_buf = ''
    # Drain anything still buffered once the command has exited.
    remainder = line_buf + f.read()
    if remainder:
        yield remainder

# ----------------------------------------------------

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host_ip, username=user, password=pw)

# Build Commands
list_dir = "ls "+hdfs_home+" -l"
getmerge = "hadoop fs -getmerge "+output_loc+output_ref+" "+hdfs_home+"test_011216_0.txt"

# Run Command
stdin, stdout, stderr = client.exec_command(list_dir)
##stdin, stdout, stderr = client.exec_command(buildMRcommand(product_name))
##stdin, stdout, stderr = client.exec_command(getmerge)

print "Executing command..."
writer = open(output_log, 'w')

for l in unbuffered_lines(stderr):
    e = '[stderr] ' + l
    print '[stderr] ' + l.strip('\n')
    writer.write(e)

for line in stdout:
    r = '[stdout] ' + line
    print '[stdout] ' + line.strip('\n')
    writer.write(r)

client.close()
writer.close()
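One refinement not in the original answer: paramiko also reports the remote command's exit status, so a wrapper can detect a failed Hadoop job directly instead of only scanning the logged output. A minimal sketch, where the `run_remote` helper name is my own and any object with an `exec_command` method returning file-like stdout/stderr (such as a connected `paramiko.SSHClient`) will work:

```python
def run_remote(client, command):
    """Run a command over an existing SSH client connection.

    Returns (exit_code, stdout_text, stderr_text). Both streams are
    read to completion first; recv_exit_status() then blocks until
    the remote command has finished.
    """
    stdin, stdout, stderr = client.exec_command(command)
    out = stdout.read()
    err = stderr.read()
    exit_code = stdout.channel.recv_exit_status()
    return exit_code, out, err
```

A non-zero exit code from the `hadoop jar` command would then indicate that the streaming job failed, which is easier to act on programmatically than inspecting the stderr log.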
