
What solver should I use to replace minFunc when porting MATLAB Sparse Filtering to F#

I have the following Sparse Filtering MATLAB code that I would like to port to F#. I am aware of the F# Type Provider for MATLAB, but I can't use it here because it would create a dependency on MATLAB (I can use it for testing).

function [optW] = SparseFiltering(N, X);
   % N = # features to learn, X = input data (examples in column)
   % You should pre-process X by removing the DC component per example,
   % before calling this function.
   % e.g., X = bsxfun(@minus, X, mean(X));
   addpath minFunc/ % Add path to minFunc optimization package
   optW = randn(N, size(X, 1));
   optW = minFunc(@SparseFilteringObj, optW(:), struct('MaxIter', 100), X, N);
   optW = reshape(optW, [N, size(X, 1)]);
end

function [Obj, DeltaW] = SparseFilteringObj (W, X, N)
   % Reshape W into matrix form
   W = reshape(W, [N, size(X,1)]);
   % Feed Forward
   F = W*X; % Linear Activation
   Fs = sqrt(F.^2 + 1e-8); % Soft-Absolute Activation
   [NFs, L2Fs] = l2row(Fs); % Normalize by Rows
   [Fhat, L2Fn] = l2row(NFs'); % Normalize by Columns
   % Compute Objective Function
   Obj = sum(sum(Fhat, 2), 1);
   % Backprop through each feedforward step
   DeltaW = l2grad(NFs', Fhat, L2Fn, ones(size(Fhat)));
   DeltaW = l2grad(Fs, NFs, L2Fs, DeltaW');
   DeltaW = (DeltaW .* (F ./ Fs)) * X';
   DeltaW = DeltaW(:);
end

function [Y,N] = l2row(X) % L2 Normalize X by rows
   % We also use this to normalize by column with l2row(X')
   N = sqrt(sum(X.^2,2) + 1e-8);
   Y = bsxfun(@rdivide,X,N);
end

function [G] = l2grad(X,Y,N,D) % Backpropagate through Normalization
   G = bsxfun(@rdivide, D, N) - bsxfun(@times, Y, sum(D.*X, 2) ./ (N.^2));
end

I understand most of the MATLAB code, but I'm not sure what the equivalent of MATLAB's minFunc is in .NET. I believe I want one of the Microsoft.SolverFoundation.Solvers classes. According to MATLAB's site:

...the default parameters of minFunc call a quasi-Newton strategy, where limited-memory BFGS updates with Shanno-Phua scaling are used in computing the step direction, and a bracketing line-search for a point satisfying the strong Wolfe conditions is used to compute the step direction. In the line search, (safeguarded) cubic interpolation is used to generate trial values, and the method switches to an Armijo back-tracking line search on iterations where the objective function enters a region where the parameters do not produce a real valued output

Given the above information, can anyone confirm that the Microsoft.SolverFoundation.Solvers.CompactQuasiNewtonModel is the right way to go?

Also, are there any other obvious "gotchas" when porting the above code to F#? (new to this type of port)

I think CompactQuasiNewtonSolver is your best bet. If you take a look at Solver Foundation's samples, there is a CQN sample which demonstrates how to implement and solve the Rosenbrock function. Its results are in line with what I see in minFunc's Rosenbrock example. The sample is in C#, but it should be easy to translate to F#.
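For reference, a minimal F# sketch of that Rosenbrock set-up with CompactQuasiNewtonSolver might look like the following. The overall shape (AddVariable/AddRow/AddGoal, then the FunctionEvaluator and GradientEvaluator delegates) follows the C# CQN sample; the F#-specific details, such as receiving the out parameters as tuples and reading results with GetValue, are my assumptions, so verify them against the sample:

open System
open Microsoft.SolverFoundation.Services
open Microsoft.SolverFoundation.Solvers

// Minimize the Rosenbrock function f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2,
// the same problem used in the CQN sample and in minFunc's example.
let solver = CompactQuasiNewtonSolver()

// Register two decision variables and one row that serves as the goal.
let _, x = solver.AddVariable(null)
let _, y = solver.AddVariable(null)
let _, row = solver.AddRow(null)
solver.AddGoal(row, 0, true) |> ignore   // true = minimize this row

// Objective value for the current variable assignment.
solver.FunctionEvaluator <-
    Func<INonlinearModel, int, ValuesByIndex, bool, float>(fun _ _ values _ ->
        let vx, vy = values.[x], values.[y]
        (1.0 - vx) ** 2.0 + 100.0 * (vy - vx * vx) ** 2.0)

// Analytic gradient, written into the supplied ValuesByIndex.
solver.GradientEvaluator <-
    Action<INonlinearModel, int, ValuesByIndex, bool, ValuesByIndex>(fun _ _ values _ gradient ->
        let vx, vy = values.[x], values.[y]
        gradient.[x] <- -2.0 * (1.0 - vx) - 400.0 * vx * (vy - vx * vx)
        gradient.[y] <- 200.0 * (vy - vx * vx))

let solution = solver.Solve(CompactQuasiNewtonSolverParams())
printfn "x = %f, y = %f" (solution.GetValue x) (solution.GetValue y)

For the sparse filtering port you would register one variable per entry of the flattened W and have the two evaluators compute the objective and gradient from SparseFilteringObj.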

Also, are there any other obvious "gotchas" when porting the above code to F#? (new to this type of port)?

You probably need a good linear algebra package to get close to the MATLAB code. Math.NET seems to be an ideal choice since it has good F# support.
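To illustrate, the two MATLAB helpers might map onto Math.NET's Matrix<float>/Vector<float> types roughly as below. This is an untested sketch that assumes the MathNet.Numerics builder API (Matrix<float>.Build.Dense, PointwiseMultiply, RowSums); check it against the library before relying on it:

open MathNet.Numerics.LinearAlgebra

// l2row: L2-normalize X by rows (pass X.Transpose() to normalize by columns),
// mirroring MATLAB's [Y, N] = l2row(X). Returns the normalized matrix and the row norms.
let l2row (x: Matrix<float>) =
    let sums = x.PointwiseMultiply(x).RowSums()
    let norms = Vector<float>.Build.Dense(x.RowCount, fun i -> sqrt (sums.[i] + 1e-8))
    let y = Matrix<float>.Build.Dense(x.RowCount, x.ColumnCount, fun i j -> x.[i, j] / norms.[i])
    y, norms

// l2grad: backpropagate through the row normalization, mirroring
// G = bsxfun(@rdivide, D, N) - bsxfun(@times, Y, sum(D.*X, 2) ./ (N.^2))
let l2grad (x: Matrix<float>) (y: Matrix<float>) (norms: Vector<float>) (d: Matrix<float>) =
    let dotRows = d.PointwiseMultiply(x).RowSums()
    Matrix<float>.Build.Dense(x.RowCount, x.ColumnCount, fun i j ->
        d.[i, j] / norms.[i] - y.[i, j] * dotRows.[i] / (norms.[i] * norms.[i]))

The rest of SparseFilteringObj is then a matrix product (W * X works directly through operator overloading) plus pointwise operations on the result.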

Alternatively, you can reimplement the code using the Accord.NET framework. The framework has an implementation of the L-BFGS algorithm used for optimization in machine learning, so it is closer to your purpose.
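For example, Accord.NET's L-BFGS lives in Accord.Math.Optimization as BroydenFletcherGoldfarbShanno and, much like minFunc, takes the objective and gradient as functions of a flat double[]. The sketch below minimizes a stand-in quadratic instead of the sparse filtering objective; the constructor and the Minimize/Solution members are quoted from memory, so treat them as assumptions:

open System
open Accord.Math.Optimization

// Toy objective in place of the sparse filtering one:
// f(w) = sum_i (w_i - 1)^2, with gradient 2 * (w_i - 1).
let objective (w: float[]) = w |> Array.sumBy (fun wi -> (wi - 1.0) * (wi - 1.0))
let gradient (w: float[]) = w |> Array.map (fun wi -> 2.0 * (wi - 1.0))

let n = 4   // number of parameters; for sparse filtering this would be N * size(X, 1)
let lbfgs =
    BroydenFletcherGoldfarbShanno(n,
                                  Func<float[], float>(objective),
                                  Func<float[], float[]>(gradient))

let converged = lbfgs.Minimize()   // run L-BFGS from the default starting point
printfn "converged: %b" converged
printfn "solution:  %A" lbfgs.Solution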
