
What does the @ in split('@') do besides being a string separator?

I am using the Coral dev board to accelerate AI models. I do not understand what the '@' means.

split returns a list of the words in a string using a specified separator. But the name of my model file does not contain an '@'.

It seems to be assigning a delegate to the model file.

The name of the model is mobilenet_v2_1.0_224_quant_edgetpu.tflite

import argparse
import time

from PIL import Image

import classify
import tflite_runtime.interpreter as tflite
import platform

EDGETPU_SHARED_LIB = {
  'Linux': 'libedgetpu.so.1',
  'Darwin': 'libedgetpu.1.dylib',
  'Windows': 'edgetpu.dll'
}[platform.system()]

def make_interpreter(model_file):
  model_file, *device = model_file.split('@')
  return tflite.Interpreter(
      model_path=model_file,
      experimental_delegates=[
          tflite.load_delegate(EDGETPU_SHARED_LIB,
                               {'device': device[0]} if device else {})
      ])

Thank you

The argument to string.split() is just a separator; it doesn't do anything else. If the separator doesn't appear in the string, a single-element list is returned: [string].
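A quick check shows both cases (the strings here are just illustrative):

```python
# Separator absent: split returns a one-element list containing the whole string.
print('model.tflite'.split('@'))        # ['model.tflite']

# Separator present: the string is cut at the '@'.
print('model.tflite@pci:0'.split('@'))  # ['model.tflite', 'pci:0']
```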

model_file, *device = model_file.split('@')

expects model_file.split('@') to return a list, and assigns the first element of that list to model_file and all subsequent elements to device (that's what the list-unpacking operator * does in this context).

If, as in this case, model_file.split('@') returns a list with only one element, then device will be an empty list [] after this line executes.
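A short sketch of that unpacking, using the asker's model name (the '@usb:1' suffix in the second case is a made-up example):

```python
# No '@' in the name: the whole string goes to model_file, device unpacks to [].
model_file, *device = 'mobilenet_v2_1.0_224_quant_edgetpu.tflite'.split('@')
print(model_file)  # mobilenet_v2_1.0_224_quant_edgetpu.tflite
print(device)      # []

# With an '@' suffix: device captures everything after the separator.
model_file, *device = 'mobilenet_v2_1.0_224_quant_edgetpu.tflite@usb:1'.split('@')
print(device)      # ['usb:1']
```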

Apologies for the downvotes. I'm Nam from the google-coral team; the downvotes are from stackoverflow users, not us. You do have a solid question, and I second @Green Cloak Guy's answer, but to expand on it further:

In our documentation for using multiple TPUs with the tflite API, you can specify which device you want to load the model on: https://coral.ai/docs/edgetpu/multiple-edgetpu/#using-the-tensorflow-lite-python-api Basically, if you have 2 PCIe devices and 2 USB devices, tflite will represent them like this:

pci:0
pci:1
usb:0
usb:1

I guess this part is not well documented and requires putting @Green's answer about Python together with our documentation. However, when you run the demo, instead of just giving the model path, you can also append the device you want the model to run on, for instance:

python3 classify_image.py \
  --model models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite@pci:0 \
  --labels models/inat_bird_labels.txt \
  --input images/parrot.jpg
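Putting the two answers together, the parsing that make_interpreter performs on the --model argument can be sketched as pure string handling, with no TPU required (the function name parse_model_arg is just for illustration):

```python
def parse_model_arg(arg):
    """Split a 'path@device' argument into (path, delegate_options)."""
    model_file, *device = arg.split('@')
    # Same expression as in make_interpreter: the device option is only
    # set when an '@device' suffix was actually given.
    options = {'device': device[0]} if device else {}
    return model_file, options

# Plain path: empty options, so the delegate picks any available Edge TPU.
print(parse_model_arg('models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite'))

# Path with '@pci:0': the delegate is pinned to the first PCIe device.
print(parse_model_arg('models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite@pci:0'))
```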
