Xamarin OpenEars Native Binding Not working on Device but works on Simulator

I have been working with the OpenEars v2.03 iOS framework in a Xamarin iOS binding project. Let me explain what I have done so far. I am new to Xcode, Xamarin and all of this binding work. This is going to be a long question, so bear with me...

1) Built the OpenEars framework project in Xcode for the Simulator. Copied the "OpenEars" file from Framework/OpenEars.framework/Versions/Current/ and renamed it "libOpenEars-i386.a".

Likewise, built the same library for an iPhone 4s by connecting the device to the Mac and selecting my iPhone as the build target. Then copied the generated OpenEars file and renamed it "libOpenEars-armv7.a".
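
In other words, the copy/rename step amounts to something like the following (illustrative commands only; the source path is the framework location mentioned above, run once against the simulator build output and once against the device build output):

cp Framework/OpenEars.framework/Versions/Current/OpenEars libOpenEars-i386.a    # from the simulator build
cp Framework/OpenEars.framework/Versions/Current/OpenEars libOpenEars-armv7.a   # from the device build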

2) Bundled the two files (libOpenEars-i386.a, libOpenEars-armv7.a) into a single file "libOpenEars.a" with the lipo command:

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a 

3) Created a binding project in Xamarin Studio and added libOpenEars.a, which automatically generated a libOpenEars.linkwith.cs containing the following code:

using System;
using ObjCRuntime;

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]

I have tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.
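
For example, one of the variants of the generated attribute that I tried looked like this (same file, only the SmartLink and LinkerFlags values changed):

// libOpenEars.linkwith.cs, with the alternative flags mentioned above
[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = false, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++ -lc++ -ObjC")]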

4) My ApiDefinition file contains all of the OpenEars interfaces; I have included only one of them here.

[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}
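
For completeness, the delegate protocol is bound in the same ApiDefinition file along the lines of the sketch below (trimmed to two callbacks; the selector names are my reading of OEEventsObserver.h, so treat them as assumptions rather than the exact binding):

// Sketch only: the delegate protocol as a [Model] base type.
// Selector names should be double-checked against OEEventsObserver.h;
// assumes the usual "using Foundation;" at the top of the ApiDefinition file.
[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
    [Export ("pocketsphinxRecognitionLoopDidStart")]
    void PocketsphinxRecognitionLoopDidStart ();

    [Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
    void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}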

5) Referenced OpenEars.dll in my iOS sample project.

6) Added the language model and acoustic model to the binding library itself. (Even though I don't need dynamic language model generation, I started from the old OpenEars sample project in the OpenEars Xamarin git rather than the new DynamicLanguageModel generator, and modified that sample for the latest API changes.)
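
As a quick sanity check that the bundled model files actually end up inside the app bundle, I can log their presence at startup (this is just a debugging aid, not part of the original sample; the file names match the ones used in init() below):

// Hypothetical check, e.g. at the top of init(): confirm the bundled files were copied into the bundle.
var resourcePath = NSBundle.MainBundle.ResourcePath;
Console.WriteLine ("languagemodel present: " + System.IO.File.Exists (System.IO.Path.Combine (resourcePath, "OpenEars1.languagemodel")));
Console.WriteLine ("dictionary present: " + System.IO.File.Exists (System.IO.Path.Combine (resourcePath, "OpenEars1.dic")));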

View controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;


    String pathToLanguageModel;
    String pathToDictionary;
    String pathToAcousticModel;

    String firstVoiceToUse;
    String secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void init()
    {
        try
        {
            observer = new OEEventsObserver();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        }
        catch(Exception e) {
            Console.WriteLine ("Exception Message :"+e.Message);
            Console.WriteLine ("Inner Exception Mesage :"+e.InnerException.Message);
        }

    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        init ();
    }

    #region Update

    public void UpdateStatus (String text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (String text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (String text)
    {
        //fliteController.SaywithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {

            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];


            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();

        }

    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event Handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try
        {
            StartListening();
            //fliteController.Init();
            //Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
            //fliteController.Say("Hai", new OEFliteVoice());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
        }
        catch(Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }
}

OpenEarsEventsObserverDelegate:

// Nothing much here; it just updates the status text and logs for debugging.

public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController controller {
        get {
            return _controller;
        }
        set {
            _controller = value;
        }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart()
    {
        //base.PocketsphinxRecognitionLoopDidStart();

        Console.WriteLine ("Pocketsphinx is starting up");
        controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
    {
        controller.UpdateText ("Heard: " + hypothesis);
        controller.Say ("You said: " + hypothesis);
    }

    public override void PocketSphinxContinuousSetupDidFail ()
    {

    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocket calibration is complete");
        controller.UpdateStatus ("Pocket calibratio is complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {

    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is now listening");
        controller.UpdateStatus ("Pocketphinx is now listening");
        controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {

    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {

    }

    public override void PocketsphinxDidSuspendRecognition ()
    {

    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {

    }

    public override void FliteDidStartSpeaking ()
    {

    }

    public override void FliteDidFinishSpeaking ()
    {

    }
}

This works perfectly on the iOS Simulator but does not work on a real device.

Simulator screenshot:

I get the error message below when running on the device; all of the bound interfaces produce the same message.

Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.

2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed  exception: Exception has been thrown by the target of an invocation.  (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj,   System.Object[] parameters) [0x00016] in   /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543 

Am I missing anything in the binding that is needed for the device?

I also tried building the same .dll with a makefile, but I get the same error message.

To build the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile used to generate OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: OpenEars.dll


OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
	$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a

clean:
	-rm -f *.dll

The complete mtouch error log can be checked here.

$ lipo -info libOpenEars.a

Architectures in the fat file: libOpenEars.a are: i386 armv7 

Checked with $ nm -arch armv7 libOpenEars.a

The full nm command output is here.

Checked whether the OEEvent symbols are present for the simulator (i386):

$ nm -arch i386 libOpenEars.a | grep OEEvent

Output:

U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

Checked whether the OEEvent symbols are present for armv7:

$ nm -arch armv7 libOpenEars.a | grep OEEvent

Output:

 U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning:    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I am not sure what I am missing. And yes, there are plenty of grammatical errors here; thank you for taking the time to read through this.

Thanks to @poupou and @Halle for their valuable input. In the end I built the fat binary with all of the architectures, including arm64 and x86_64 (a must), and packaged everything into a single file with lipo. Now it works like a charm! I also had to set Project Options -> Advanced -> Supported Architectures -> ARMv7, and it works on devices such as the iPad 2 and iPhone 4. I still need to test on the iPhone 6 and 6+, but I expect they will work as well since they belong to the arm64 family. I am not sure how this works on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I don't see ARMv7s support in OpenEars v2.03.
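
For reference, the final step is just the lipo command from earlier extended to four per-architecture binaries (the extra slices come from the same xcodebuild pattern shown above, with -arch x86_64 against the simulator SDK and -arch arm64 against the device SDK; the file names simply continue my earlier naming scheme):

lipo -create -output libOpenEars.a \
    libOpenEars-i386.a libOpenEars-x86_64.a \
    libOpenEars-armv7.a libOpenEars-arm64.a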
