
How to import NLP model (facebook bart large mnli model) in Julia?

I would like to seek help in importing the bart-large-mnli model for zero-shot classification in Julia.

Reference to the model: https://metatext.io/models/facebook-bart-large-mnli

This is the Python example that I want to port to Julia:

from transformers import pipeline
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")
sequence_to_classify = "one day I will see the world"
candidate_labels = ['travel', 'cooking', 'dancing']
classifier(sequence_to_classify, candidate_labels)

Expected Output:

{'sequence': 'one day I will see the world', 
 'labels': ['travel', 'dancing', 'cooking'], 
 'scores': [0.9938650727272034, 0.0032738070003688335, 0.002861041808500886]
}

Please advise or suggest a solution for this scenario. I look forward to your responses. Thanks!

I'm not quite sure what your desired use case is, but if you just want access to the pretrained Hugging Face model's output in your Julia code, you can use PyCall.jl to call that Python code and return the dictionary you're interested in.

That is, in Julia (after running using PyCall), run the Python code inside a py"""...""" block:

julia> py"""
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")
sequence_to_classify = "one day I will see the world"
candidate_labels = ['travel', 'cooking', 'dancing']
output = classifier(sequence_to_classify, candidate_labels)
"""

Then the Python global variable output will be accessible as py"output" in Julia (the Python dict is automatically converted to a Julia Dict), like:

julia> py"output"
Dict{Any, Any} with 3 entries:
  "scores"   => [0.993865, 0.00327379, 0.00286104]
  "sequence" => "one day I will see the world"
  "labels"   => ["travel", "dancing", "cooking"]

You can also get it as a PyObject, without the automatic type conversion, by putting o after the string:

julia> py"output"o
PyObject {
  'sequence': 'one day I will see the world', 
  'labels': ['travel', 'dancing', 'cooking'], 
  'scores': [0.9938650727272034, 0.0032737923320382833, 0.002861042506992817]
}
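
If you keep the un-converted PyObject around, you can still pull out a single field with PyCall's get, which calls the Python object's __getitem__ and converts just that value, something like:

labels = get(py"output"o, "labels")   # ≈ output['labels'] in Python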

You can also get the same result by importing the Python package transformers into Julia with PyCall's pyimport():

using PyCall
transformers = PyCall.pyimport("transformers")
classifier = transformers.pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
sequence_to_classify = "one day I will see the world"
candidate_labels = ["travel", "cooking", "dancing"]
output = classifier(sequence_to_classify, candidate_labels)

Now the Julia variable output will be the Dict you want.
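
If you want to classify more than one sentence, you could wrap the call in a small helper that returns label => score pairs (just a sketch building on the classifier defined above; classify_topics is an illustrative name, not part of transformers or PyCall):

# assumes classifier from the pyimport snippet above
function classify_topics(text::AbstractString, labels::Vector{String})
    out = classifier(text, labels)      # PyCall converts the returned dict to a Julia Dict
    return [l => s for (l, s) in zip(out["labels"], out["scores"])]
end

classify_topics("one day I will see the world", ["travel", "cooking", "dancing"])
# => ["travel" => 0.9939, "dancing" => 0.0033, "cooking" => 0.0029] (approximately)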
