
High CPU usage when using multicast with Python

I'm programming a little multi-protocol image streaming server in Python, and all the protocols work well enough except for multicast, which drives my CPU usage up to 150%!

Here's the multicast code:

    delay = 1./self.flux.ips
    imgid = 0
    lastSent = 0

    while self.connected:

        #self.printLog("Getting ready to fragment {}".format(imgid))
        fragments = fragmentImage(self.flux.imageFiles[imgid], self.fragmentSize)
        #self.printLog("Fragmented {} ! ".format(imgid))

        # Checking whether the delay has passed, to respect the framerate
        while (time.time() - lastSent) < delay:
            pass

        # Sending the fragments
        for fragmentid in range(len(fragments)):
            formatedFragment = formatFragment(fragments[fragmentid], fragmentid*self.fragmentSize, len(self.flux.imageFiles[imgid]), imgid)
            self.sendto(formatedFragment, (self.groupAddress, self.groupPort))

        lastSent = time.time()

        imgid = (imgid + 1) % len(self.flux.images)

The UDP protocol also sends the images as fragments, and there I don't have any CPU usage problems. Note that the client also has some latency when receiving the images.

Use `time.sleep(delay)` instead of the (heavy) busy waiting and you should be good (see this question: Python: Pass or Sleep for long running processes?).
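A minimal sketch of the fix: instead of spinning in the `while (time.time() - lastSent) < delay` loop, sleep for whatever part of the frame interval is left. The frame rate and loop count below are illustrative stand-ins for the server's own values.

```python
import time

delay = 1.0 / 30          # frame interval; 30 images/s is an illustrative rate
lastSent = time.time()
start = lastSent

for _ in range(3):        # stand-in for the `while self.connected` loop
    # ... fragment and send one image here ...

    # Sleep only for the part of the frame interval that remains,
    # instead of busy-waiting; the OS wakes the process when it's due.
    remaining = delay - (time.time() - lastSent)
    if remaining > 0:
        time.sleep(remaining)
    lastSent = time.time()

elapsed = time.time() - start   # roughly 3 frame intervals, with near-zero CPU use
```

Sleeping for the *remaining* time (rather than a fixed `delay`) also keeps the frame rate accurate when fragmenting and sending take a non-trivial amount of time.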

For even better performance, consider an I/O event reactor such as PyUV, gevent, tornado, or twisted.
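As an illustration of the event-loop style those libraries share, here is a sketch using the standard library's `asyncio` (not one of the reactors named above, but the same idea): the coroutine yields to the loop between frames instead of blocking or spinning. `send_image` and the rates are hypothetical placeholders.

```python
import asyncio

async def stream_images(n_frames=3, ips=30):
    """Sketch of a frame loop driven by an event loop (illustrative names/rates)."""
    delay = 1.0 / ips
    for imgid in range(n_frames):
        # send_image(imgid) would fragment and send the frame here (hypothetical)
        await asyncio.sleep(delay)  # yields control to the loop instead of spinning
    return n_frames

sent = asyncio.run(stream_images())
```

The advantage over plain `time.sleep` is that the loop can serve other clients or protocols while a stream is waiting for its next frame slot.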
