
Make tensorflow faster on a nodejs server?

I've created a script to detect the objects in an image with tensorflow js, but it is really slow. For that I installed these npm packages:

@tensorflow/tfjs
@tensorflow-models/coco-ssd
@tensorflow-models/mobilenet
get-image-data

And this is my script:

const tf = require('@tensorflow/tfjs')
// pure-JS CPU backend (no native binding)
const mobilenet = require('@tensorflow-models/mobilenet');
const cocoSsd = require("@tensorflow-models/coco-ssd");
const events = require('events');
const post_event = new events.EventEmitter();
const start = Date.now()
// for getting the data images
const image = require('get-image-data')

image('./img/cup.jpg',async(err, image)=>{
    const numChannels = 3;
    const numPixels = image.width * image.height;
    const values = new Int32Array(numPixels * numChannels);
    const pixels = image.data // RGBA bytes from get-image-data
    // copy the RGB values, dropping the alpha channel
    for(let i = 0; i < numPixels; i++) {
        for (let channel = 0; channel < numChannels; ++channel) {
            values[i * numChannels + channel] = pixels[i * 4 + channel];
        }
    }
    const outShape = [image.height, image.width, numChannels];
    const input = tf.tensor3d(values, outShape, 'int32');
    await load(input)
});

const load=async img=>{
    console.log("IMG LOADED in ", (Date.now()-start)/1000,"s")
    let mobilenet_ = false, cocossd_ = false, post_predictions = [];

    mobilenet.load().then(async model=>{
        console.log("mobilenet loaded in ",(Date.now()-start)/1000,"s")
        model.classify(img).then(async classify_predictions=>{
            for(let i = 0; i < classify_predictions.length; i++){
                const element = classify_predictions[i];
                const each_class = element["className"].split(", ")
                each_class.forEach(this_element=>{
                    post_predictions.push([this_element, (element.probability*100)]);
                })
            }
            post_event.emit("mobilenet")
        });        
    });

    cocoSsd.load().then(async model=>{
        console.log("cocossd loaded in ",(Date.now()-start)/1000,"s")
        model.detect(img).then(predictions=>{
            for(let i = 0; i < predictions.length; i++){
                const this_element = predictions[i];
                post_predictions.unshift([this_element.class, (this_element.score*100)]);
            }
            post_event.emit("cocossd")
        });
    })

    post_event.on("mobilenet", ()=>{
        console.log("mobilenet(longest) finished in ", (Date.now()-start)/1000,"s", post_predictions)
        mobilenet_=true
        if(mobilenet_ && cocossd_){
            post_event.emit("finish")
        }
    }).on("cocossd", ()=>{
        console.log("cocossd finished in ", (Date.now()-start)/1000,"s", post_predictions)
        cocossd_ = true
        if(mobilenet_ && cocossd_){
            post_event.emit("finish")
        }
    }).on("finish", ()=>{
        post_predictions.sort((a, b)=>{
            return b[1]-a[1];
        });
        console.log("Post in ", (Date.now()-start)/1000,"s", post_predictions)
    })
}

This works, but when I run it, it's really slow. Here are the results:

IMG LOADED in  0.486 s
cocossd loaded in  6.11 s
cocossd finished in  9.028 s [ [ 'cup', 95.68768739700317 ] ]
mobilenet loaded in  10.845 s
mobilenet(longest) finished in  12.795 s [
  [ 'cup', 95.68768739700317 ],
  [ 'cup', 69.30274367332458 ],
  [ 'espresso', 17.099112272262573 ],
  [ 'coffee mug', 13.384920358657837 ]
]
Post in  12.809 s [
  [ 'cup', 95.68768739700317 ],
  [ 'cup', 69.30274367332458 ],
  [ 'espresso', 17.099112272262573 ],
  [ 'coffee mug', 13.384920358657837 ]
]

I've watched some videos where they say the nodejs version of mobilenet takes 20 ms to produce results, but in my app it takes 10 s. Maybe I did something wrong. Can someone help me fix this problem?

Thanks

Loading the model takes some time. For example, you could create an express server that expects an image and does the object detection. When you start the server, the models can be preloaded; on each API request the models are then already loaded, and detection is done in milliseconds (hopefully :-))

