
Sharing global variables across processes with multiprocessing in Python

I am monitoring a file from which I need two fields. The problem is that with the multiprocessing library I cannot share global variables: I modify them in the main process, but the modifications never show up in the child processes. I've been investigating and can't find a solution. The child processes that are running need to see, in real time, the modification made in the main process.

import json
import multiprocessing
import time

WH = ''
delay = 0.0


def comprobar():
    # Child process: prints the (supposedly shared) globals.
    print(WH)
    print(delay)


if __name__ == "__main__":

    while True:

        with open("configUsers.json") as archivo:
            data = json.load(archivo)
            WH = data['WH']
            delay = data['delay']
            thread = multiprocessing.Process(name="hilo1", target=comprobar, args=())
            thread.start()
            time.sleep(0.5)

What OS are you running on: macOS/Windows, or Linux? I'm guessing the former.

Python has two different ways of starting a subprocess.

In one case ("forking") it makes an exact copy of the world, and then calls the function. Forking is the default on Linux. In this world, you'd see WH and delay with their updated values.

In the other case ("spawning"), it starts a fresh Python interpreter and re-imports the file, but doesn't execute the code inside if __name__ == "__main__". This is the default on Windows and macOS. In this case, your code would still see the values as '' and 0.0.
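A quick way to see which behavior you're getting is to ask multiprocessing for the platform's default start method; this is a minimal sketch using the standard-library API:

    import multiprocessing

    # Report this platform's default start method: "fork" on Linux,
    # "spawn" on Windows and macOS (Python 3.8+).
    method = multiprocessing.get_start_method()
    print(method)

On Linux you can also opt into "spawn" explicitly with multiprocessing.set_start_method("spawn") at the top of the __main__ block, which reproduces the Windows/macOS behavior for testing.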

In your case, you should just pass WH and delay as arguments to comprobar(). I suspect you're actually trying to do something more complicated, but it's hard to tell from this limited example.
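Concretely, that fix looks something like the sketch below. Since a new process is started on every loop iteration anyway, passing the freshly read values as args works under both fork and spawn; the config dict here is a hypothetical stand-in for the contents of configUsers.json:

    import multiprocessing


    def comprobar(wh, delay):
        # The child receives the current values explicitly; no globals needed.
        print(wh)
        print(delay)


    if __name__ == "__main__":
        # Hypothetical values standing in for json.load(open("configUsers.json")).
        data = {'WH': 'some-webhook-value', 'delay': 1.5}
        p = multiprocessing.Process(name="hilo1", target=comprobar,
                                    args=(data['WH'], data['delay']))
        p.start()
        p.join()

If you truly need a long-lived child to see updates while it runs, the standard-library options are multiprocessing.Value / multiprocessing.Array for simple types, or a multiprocessing.Manager dict for richer data.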
