Xamarin OpenEars Native Binding Not working on Device but works on Simulator
I have been using the OpenEars v2.03 iOS framework in a Xamarin iOS binding project. Let me explain what I have done so far. I am new to Xcode, Xamarin, and bindings in general. This is going to be a long question, so bear with me…
1) Built the OpenEars framework project in Xcode for the Simulator. Copied the "OpenEars" file from Framework/OpenEars.framework/Versions/Current/ and renamed it to "libOpenEars-i386.a".
Similarly, built the same library for an iPhone 4S by connecting the device to the Mac and selecting my iPhone as the build target. Finally, copied the generated OpenEars binary and renamed it to "libOpenEars-armv7.a".
2) Bundled the two files (libOpenEars-i386.a, libOpenEars-armv7.a) into a single fat file, "libOpenEars.a", with the following lipo command:
lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a
3) Created a Binding project in Xamarin Studio and added libOpenEars.a, which automatically generates a libOpenEars.linkwith.cs. Here is its code:
using System;
using ObjCRuntime;
[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]
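As the resolution at the end of this question notes, the device failure went away once the fat library contained arm64 and x86_64 slices as well. A sketch of what the LinkWith attribute might look like once those slices exist in the archive (the extra LinkTarget flags are an assumption based on that fix):

```csharp
using ObjCRuntime;

// Hypothetical variant: the archive must actually contain all four slices
// (i386, x86_64, armv7, arm64) for these targets to link successfully.
[assembly: LinkWith ("libOpenEars.a",
	LinkTarget.ArmV7 | LinkTarget.Arm64 | LinkTarget.Simulator | LinkTarget.Simulator64,
	SmartLink = true,
	ForceLoad = true,
	Frameworks = "AudioToolbox AVFoundation",
	IsCxx = true,
	LinkerFlags = "-lstdc++")]
```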
I have also tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.
4) My ApiDefinition file contains all of the OpenEars interfaces; I am including only one of them here.
[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
[Wrap ("WeakDelegate")]
OEEventsObserverDelegate Delegate { get; set; }
[Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
NSObject WeakDelegate { get; set; }
}
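The view-controller code below also depends on a binding for the delegate protocol, which is not shown above. A minimal sketch of what that ApiDefinition entry might look like — the selector names are assumptions modelled on the OpenEars OEEventsObserverDelegate protocol and should be verified against OEEventsObserver.h:

```csharp
[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
	// Selector names are assumptions; check them against OEEventsObserver.h.
	[Export ("pocketsphinxDidStartListening")]
	void PocketsphinxDidStartListening ();

	[Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
	void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}
```

The [Model] attribute is what lets the generator produce an overridable base class (as used by OpenEarsEventsObserverDelegate below) rather than only an interface.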
5) Referenced OpenEars.dll in my iOS sample project.
6) Added the language model and acoustic model to the binding library itself. (Since dynamic language model generation is not required, I used the old OpenEars sample project from the OpenEars Xamarin git instead of the new DynamicLanguageModel generator, and modified the sample for the latest API changes.)
The view controller:
public partial class OpenEarsNewApiViewController : UIViewController
{
OEEventsObserver observer;
OEFliteController fliteController;
OEPocketsphinxController pocketSphinxController;
String pathToLanguageModel;
String pathToDictionary;
String pathToAcousticModel;
String firstVoiceToUse;
String secondVoiceToUse;
static bool UserInterfaceIdiomIsPhone {
get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
}
public void init()
{
try
{
observer = new OEEventsObserver();
observer.Delegate = new OpenEarsEventsObserverDelegate (this);
pocketSphinxController = new OEPocketsphinxController ();
fliteController = new OEFliteController();
firstVoiceToUse = "cmu_us_slt";
secondVoiceToUse = "cmu_us_rms";
pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
}
catch (Exception e) {
Console.WriteLine ("Exception Message: " + e.Message);
if (e.InnerException != null)
Console.WriteLine ("Inner Exception Message: " + e.InnerException.Message);
}
}
public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
{
init ();
}
#region Update
public void UpdateStatus (String text)
{
txtStatus.Text = text;
}
public void UpdateText (String text)
{
txtOutput.Text = text;
}
public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
{
btnStartListening.Hidden = hidden1;
btnStopListening.Hidden = hidden2;
btnSuspend.Hidden = hidden3;
btnResume.Hidden = hidden4;
}
public void Say (String text)
{
//fliteController.SaywithVoice (text, secondVoiceToUse);
}
public void StartListening ()
{
//pocketSphinxController.RequestMicPermission ();
if (!pocketSphinxController.IsListening) {
//NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];
pocketSphinxController.StartListeningWithLanguageModelAtPath (
pathToLanguageModel,
pathToDictionary,
pathToAcousticModel,
false
);
} else {
new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();
}
}
public void StopListening ()
{
//pocketSphinxController.StopListening ();
}
public void SuspendRecognition ()
{
pocketSphinxController.SuspendRecognition ();
}
public void ResumeRecognition ()
{
pocketSphinxController.ResumeRecognition ();
}
#endregion
#region Event Handlers
partial void btnStartListening_TouchUpInside (UIButton sender)
{
try
{
StartListening();
//fliteController.Init();
//Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
//fliteController.Say("Hai", new OEFliteVoice());
UpdateButtonStates (true, false, false, true);
Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
}
catch(Exception e)
{
Console.WriteLine(e.Message);
}
}
partial void btnStopListening_TouchUpInside (UIButton sender)
{
StopListening ();
UpdateButtonStates (false, true, true, true);
}
partial void btnSuspend_TouchUpInside (UIButton sender)
{
SuspendRecognition ();
UpdateButtonStates (true, false, true, false);
}
partial void btnResume_TouchUpInside (UIButton sender)
{
ResumeRecognition ();
UpdateButtonStates (true, false, false, true);
}
}
OpenEarsEventsObserverDelegate:
// Nothing much here; just status updates for debugging
public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
OpenEarsNewApiViewController _controller;
public OpenEarsNewApiViewController controller {
get {
return _controller;
}
set {
_controller = value;
}
}
public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
{
controller = ctrl;
}
public override void PocketsphinxRecognitionLoopDidStart()
{
//base.PocketsphinxRecognitionLoopDidStart();
Console.WriteLine ("Pocketsphinx is starting up");
controller.UpdateStatus ("Pocketsphinx is starting up");
}
public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
{
controller.UpdateText ("Heard: " + hypothesis);
controller.Say ("You said: " + hypothesis);
}
public override void PocketSphinxContinuousSetupDidFail ()
{
}
public override void PocketsphinxDidCompleteCalibration ()
{
Console.WriteLine ("Pocketsphinx calibration is complete");
controller.UpdateStatus ("Pocketsphinx calibration is complete");
}
public override void PocketsphinxDidDetectSpeech ()
{
}
public override void PocketsphinxDidStartListening ()
{
Console.WriteLine ("Pocketsphinx is now listening");
controller.UpdateStatus ("Pocketsphinx is now listening");
controller.UpdateButtonStates (true, false, false, true);
}
public override void PocketsphinxDidStopListening ()
{
}
public override void PocketsphinxDidStartCalibration ()
{
Console.WriteLine ("Pocketsphinx calibration has started.");
controller.UpdateStatus ("Pocketsphinx calibration has started");
}
public override void PocketsphinxDidResumeRecognition ()
{
}
public override void PocketsphinxDidSuspendRecognition ()
{
}
public override void PocketsphinxDidDetectFinishedSpeech ()
{
}
public override void FliteDidStartSpeaking ()
{
}
public override void FliteDidFinishSpeaking ()
{
}
}
This works perfectly on the iOS Simulator, but not on a real device.
When running on the device I get the error message below; all of the bound interfaces produce the same message.
Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.
2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed exception: Exception has been thrown by the target of an invocation. (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj, System.Object[] parameters) [0x00016] in /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543
Am I missing anything related to binding for the device?
I also tried building the same .dll with a makefile, but got the same error message.
To build the OpenEars framework:
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build
The makefile used to generate OpenEars.dll:
BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native
all: OpenEars.dll
OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a
clean:
-rm -f *.dll
$ lipo -info libOpenEars.a
Architectures in the fat file: libOpenEars.a are: i386 armv7
Checked the symbols in the archive with nm.
Checking whether OEEvent is present in the simulator slice (i386):
$ nm -arch i386 libOpenEars.a | grep OEEvent
Output:
U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
Checking whether OEEvent is present in the armv7 slice:
$ nm -arch armv7 libOpenEars.a | grep OEEvent
Output:
U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
I am not sure what I am missing. Yes, there are plenty of grammatical errors here; thank you for taking the time to read this.
Thanks to @poupou and @Halle for their valuable input. In the end I built the fat binary with all architectures, including arm64 and x86_64 (a must), and packed everything into a single file with lipo. It now works like a charm! Also set Project Options -> Advanced -> Supported Architectures -> ARMv7, which works on devices such as the iPad 2 and iPhone 4. I still need to test on the iPhone 6 and 6 Plus, which I expect to work since they are in the arm64 family. I am not sure how this will behave on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I do not see ARMv7s support in OpenEars v2.03.
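The fix above can be sketched as a build script. The SDK flags and intermediate file names are assumptions mirroring steps 1–2; adjust them to your Xcode setup, and copy/rename each build product out of the framework bundle as in step 1 before running lipo:

```
# Build each slice (SDK/arch flags are assumptions; adjust to your Xcode setup).
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator -arch i386 -configuration Release build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator -arch x86_64 -configuration Release build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch arm64 -configuration Release build

# After copying/renaming each build product as in step 1, combine all slices:
lipo -create -output libOpenEars.a \
  libOpenEars-i386.a libOpenEars-x86_64.a libOpenEars-armv7.a libOpenEars-arm64.a

# Verify: the fat archive should now list i386 x86_64 armv7 arm64
lipo -info libOpenEars.a
```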