
Adding Key/Value To Hash (Perl)

Hi, I'm trying to add multiple keys/values into a hash: basically, file names and their data. Each JSON file's content contains hash and array references. The hash that contains the file names and data will be handled elsewhere.

This is my code:

sub getDecode {
    my $self = shift;
    my @arrUrls = ('http://domain.com/test.json', 'http://domain.com/test_two.json', 'http://domain.com/test3.json');

    my $resQueue = Thread::Queue->new();
    my $intThreads = 10;
    my @arrThreads = ();
    my %arrInfo = ();

    foreach my $strUrl (@arrUrls) {
        for (1..$intThreads) {
            push (@arrThreads, threads->create(sub {
                while (my $resTask = $resQueue->dequeue) {
                    my $resData = get($strUrl);
                    my $strName = basename($strUrl, '.json');
                    my $arrData = decode_json($resData);
                    $arrInfo{$strName} = $arrData;
                }
            }));
        }
    }

    $resQueue->enqueue(@arrUrls);
    $resQueue->enqueue(undef) for 1..$intThreads;
    $_->join for @arrThreads;

    return %arrInfo;
}

When I try dumping %arrInfo, no output is given. Please help!

You're multithreading and not sharing the variable. When a thread spawns, the existing variable-space is cloned, so each thread has its own local copy of %arrInfo which is discarded when it exits.

You need to:

use threads::shared;
my %arrInfo : shared;
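There's a related gotcha once the hash is shared: you can't store a plain (unshared) reference, such as the structure decode_json returns, directly into it. A minimal sketch, using core JSON::PP in place of whichever JSON module your code actually loads:

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use JSON::PP qw(decode_json);

my %arrInfo : shared;

# Assigning a plain hashref into a shared hash dies with
# "Invalid value for shared scalar"; the decoded structure
# must be cloned into shared space first with shared_clone().
my $arrData = decode_json('{"a":[1,2,3]}');
$arrInfo{test} = shared_clone($arrData);

print $arrInfo{test}{a}[2], "\n";   # prints 3
```

shared_clone() is exported by threads::shared and deep-copies the whole nested structure into shared variables.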

You're also doing something a bit odd with your thread-spawn loop: you're spawning 10 threads x 3 URLs, for 30 threads, but only queuing 3 URLs to process. But then you're also not actually using $resTask at all (each thread just fetches the $strUrl it closed over at spawn time), which doesn't make a lot of sense.

So I'm prepared to bet your code hangs at the end, because you're trying to join some threads that aren't complete.

You might find $resQueue->end() is more suitable than queuing up undef.
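Putting those fixes together, here's a minimal self-contained sketch of the worker-pool version. fetch_url is a hypothetical stub standing in for LWP::Simple's get() so the example runs offline, and JSON::PP stands in for whichever JSON module the original code loads:

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;
use JSON::PP qw(decode_json);
use File::Basename;

# Stub in place of LWP::Simple's get() so the sketch runs offline.
sub fetch_url {
    my ($url) = @_;
    return '{"url":"' . $url . '"}';
}

sub getDecode {
    my @arrUrls = ('http://domain.com/test.json', 'http://domain.com/test_two.json');

    my $resQueue   = Thread::Queue->new();
    my $intThreads = 4;
    my %arrInfo : shared;      # shared, so the workers' writes survive

    # Spawn the pool once, not once per URL; each worker pulls
    # items off the queue until it is ended.
    my @arrThreads = map {
        threads->create(sub {
            while (defined(my $strUrl = $resQueue->dequeue)) {
                my $resData = fetch_url($strUrl);
                my $strName = basename($strUrl, '.json');
                # Decoded structures must be cloned into shared space.
                $arrInfo{$strName} = shared_clone(decode_json($resData));
            }
        });
    } 1 .. $intThreads;

    $resQueue->enqueue(@arrUrls);
    $resQueue->end();          # every blocked dequeue now returns undef
    $_->join for @arrThreads;

    return %arrInfo;
}

my %res = getDecode();
print "$_ => $res{$_}{url}\n" for sort keys %res;
```

With end(), there's no need to enqueue one undef terminator per worker, and the dequeue loop checks defined() so a queued value of 0 or "" wouldn't end a worker by accident.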

Example using a shared hash:

use strict;
use warnings;
use threads;
use threads::shared;

use Data::Dumper;

my %test_hash : shared;
my %second_hash;

$test_hash{'from_parent'}       = 1;
$second_hash{'from_parent_too'} = 1;

threads->create(
    sub {
        $test_hash{'from_first'}       = 2;
        $second_hash{'from_first_too'} = 2;
    }
);
threads->create(
    sub {
        $test_hash{'from_second'}       = 3;
        $second_hash{'from_second_too'} = 3;
    }
);

foreach my $thr ( threads->list() ) { $thr->join }

print Dumper \%test_hash;
print Dumper \%second_hash;

For a 'worker threads' style approach, I'd offer: Perl daemonize with child daemons
