Why is one of my functions running so much slower?

I'm writing an implementation of a couple of methods for finding the natural logs of numbers using GMP in C. I have two functions, both of which work, but one runs a lot slower than the other. The problem is that the one I expected to be faster is the slower of the two.

Below are the two relevant functions; the full file with int main can be found here, and the required ln_2.txt file is here.

/* ln(N) via the Taylor series of ln(1 + x) with x = N - 1;
 * iterates until successive partial sums differ by less than T. */
void taylor_log(mpf_t R,
                const mpf_t N,
                const mpf_t T)
{
    mpf_t x, y, r, pr, tmp, d;
    int n = 1;
    mpf_init(x);
    mpf_init(y);
    mpf_init(tmp);
    mpf_init(d);

    mpf_sub_ui(x, N, 1);
    mpf_init_set(y, x);

    mpf_init_set(r, x);
    mpf_init_set_ui(pr, 0);    

    mpf_sub(d, r, pr);
    mpf_abs(d, d);
    while(mpf_cmp(d, T) > 0)
    {
        mpf_set(pr, r);

        mpf_mul(y, y, x);
        mpf_div_ui(tmp, y, ++n);
        mpf_sub(r, r, tmp);

        mpf_mul(y, y, x);
        mpf_div_ui(tmp, y, ++n);
        mpf_add(r, r, tmp);

        mpf_sub(d, r, pr);
        mpf_abs(d,d);
    }
    printf("%d\n", n);
    mpf_set(R, r);
}


/* ln(N) via the hyperbolic (atanh) series with x = (N - 1)/(N + 1);
 * only odd powers appear, so n advances by 2 per term. */
void hyperbolic_log(mpf_t R,
                    const mpf_t N,
                    const mpf_t T)
{
    mpf_t x, x2, r, pr, tmp, d;
    int n = 1;   
    mpf_init(x);
    mpf_init(x2);
    mpf_init(tmp);
    mpf_init(d);

    mpf_sub_ui(x, N, 1);
    mpf_add_ui(tmp, N, 1);
    mpf_div(x, x, tmp);

    mpf_init_set(r, x);
    mpf_init_set_ui(pr, 0);

    mpf_mul(x2, x, x);

    mpf_sub(d, r, pr);
    mpf_abs(d,d);
    while(mpf_cmp(d, T) > 0)
    {
        mpf_set(pr, r);
        ++n;

        mpf_mul(x, x, x2);
        mpf_div_ui(tmp, x, ++n);
        mpf_add(r, r, tmp);

        mpf_sub(d, r, pr);
        mpf_abs(d,d);
    }
    printf("%d\n", n);
    mpf_mul_ui(R, r, 2);
}

Now the second function should, in theory at least, run more quickly, as it has fewer instructions per loop and typically executes fewer iterations due to faster convergence. This is not what I see in practice: when I calculated ln(2) to 10000 decimal places using at least 33296 bits in the calculations, both gave the correct result, but the first method completed in about 0.150 seconds while the second takes about 1 second.

I have no idea what is causing one function to run so much slower than the other, and any help would be appreciated.

Edit: For clarity, I forgot to mention that before being passed to the functions the values are normalized to the range [0.5, 1). This is the same for both functions, and I've confirmed it works in the program, so technically my issue arises when 0.5 is provided to the algorithms.

Running the program with added timing functions, the performance hit comes, for N = 2, from the mpf_mul operations.

In the Taylor case,

mpf_mul(y, y, x);

has x being -0.5 from mpf_get_d_2exp((long *)&e, N) - 1, which in the gmp structure has a size of 1, whatever the precision requested.

In the hyperbolic case,

mpf_mul(x, x, x2);

has x2 being 0.111... from

T  = mpf_get_d_2exp((long *)&e, N))
x  = (T - 1)/(T + 1)
x2 = x * x

which in the gmp structure has the size of the requested precision. The multiplication in this case is much more expensive.

Edit:

It seems the actual culprit is mpf_div(x, x, tmp), which forces gmp to use the whole precision. x2, being calculated from x, gets the same precision, and thus the mpf_mul calls are slower.

I've benchmarked your program for both cases.

  • These are nanosecond resolution.
  • They compensate for the overhead of the benchmark calls (the compensation is calculated by the bnctst at the beginning).
  • Although the output is floating-point fractional seconds, during the benchmark run they use 64-bit integer times.
  • Outer functions are benchmarked.
  • Loops are benchmarked.
  • All mpf_* calls have their own.
  • If a benchmark is invoked more than once (e.g. loops), it will show average, count, and min/max in addition to total.
  • The benchmarks are output in the order the calls appear in your source file.

The gmp_printf calls have a wrapper around them that wraps the output at 78 chars, and the final output for hyperbolic has been elided to fit in the SO 30,000-character limit.

23:32:14.233952283 NEWDAY 11/05/15
23:32:14.233952283 ph: starting 13378 ...
23:32:14.234557628 ph: ARGV tayhypgo ...

--------------------------------------------------------------------------------
23:32:16 tayhypgo: ARGV 2 10000 33296 1
23:32:17 ovrgo: SDIR /home/cae/preserve/ovrbnc/tayhyp
23:32:18 ovrgo: /home/cae/preserve/ovrstk/gen/tayhyp/tayhyp 2 10000 33296 1
bnctst: BEST bncmin=21 bncmax=-1 skipcnt=1000
bnctst: AVG tot=0.000000000

bnctst: SKP tot=0.000023975 avg=0.000000023 cnt=1,000
bnctst: BNCDMP   min=0.000000000 max=0.000000000

2
PRINTF 0.50000000000000000000
Case 1
33207
PRINTF 0.693147180559945309417232121458176568075500134360255254120680009493393
PRINTF 62196969471560586332699641868754200148102057068573368552023575813055703
PRINTF 26707516350759619307275708283714351903070386238916734711233501153644979
PRINTF 55239120475172681574932065155524734139525882950453007095326366642654104
PRINTF 23915781495204374043038550080194417064167151864471283996817178454695702
PRINTF 62716310645461502572074024816377733896385506952606683411372738737229289
PRINTF 56493547025762652098859693201965058554764703306793654432547632744951250
PRINTF 40606943814710468994650622016772042452452961268794654619316517468139267
PRINTF 25041038025462596568691441928716082938031727143677826548775664850856740
PRINTF 77648451464439940461422603193096735402574446070308096085047486638523138
PRINTF 18167675143866747664789088143714198549423151997354880375165861275352916
PRINTF 61000710535582498794147295092931138971559982056543928717000721808576102
PRINTF 52368892132449713893203784393530887748259701715591070882368362758984258
PRINTF 91853530243634214367061189236789192372314672321720534016492568727477823
PRINTF 44535347648114941864238677677440606956265737960086707625719918473402265
PRINTF 14628379048830620330611446300737194890027436439650025809365194430411911
PRINTF 50608094879306786515887090060520346842973619384128965255653968602219412
PRINTF 29242075743217574890977067526871158170511370091589426654785959648906530
PRINTF 58460258668382940022833005382074005677053046787001841624044188332327983
PRINTF 86349001563121889560650553151272199398332030751408426091479001265168243
PRINTF 44389357247278820548627155274187724300248979454019618723398086083166481
PRINTF 14909306675193393128904316413706813977764981769748689038877899912965036
PRINTF 19270710889264105230924783917373501229842420499568935992206602204654941
PRINTF 51061391878857442455775102068370308666194808964121868077902081815885800
PRINTF 01688115973056186676199187395200766719214592236720602539595436541655311
PRINTF 29517598994005600036651356756905124592682574394648316833262490180382424
PRINTF 08242314523061409638057007025513877026817851630690255137032340538021450
PRINTF 19015374029509942262995779647427138157363801729873940704242179972266962
PRINTF 97993931270693574724049338653087975872169964512944649188377115670167859
PRINTF 88049818388967841349383140140731664727653276359192335112333893387095132
PRINTF 09059272185471328975470797891384445466676192702885533423429899321803769
PRINTF 15497334026754675887323677834291619181043011609169526554785973289176354
PRINTF 55567428638774639871019124317542558883012067792102803412068797591430812
PRINTF 83307230300883494705792496591005860012341561757413272465943068435465211
PRINTF 13502154434153995538185652275022142456644000627618330320647272572197515
PRINTF 29082785684213207959886389672771195522188190466039570097747065126195052
PRINTF 78932296088931405625433442552392062030343941777357945592125901992559114
PRINTF 84402423901255425900312953705192206150643458378787300203541442178575801
PRINTF 32364516607099143831450049858966885772221486528821694181270488607589722
PRINTF 03216663128378329156763074987298574638928269373509840778049395004933998
PRINTF 76264755070316221613903484529942491724837340613662263834936811168416705
PRINTF 69252147513839306384553718626877973288955588716344297562447553923663694
PRINTF 88877823890174981027356552405051854773061944052423221255902483308277888
PRINTF 88905962911972995457441562451248592683112607467972816380902500056559991
PRINTF 46128332543581114048482060640824224792403855764762350311003242597091425
PRINTF 01114615584830670012583182191534720747411194009835573272826144273821397
PRINTF 07047795625967057902303384806171345555368553758106574973444792251119654
PRINTF 61618278960100685129653954796586637835224736245460935850360506784143911
PRINTF 44523145778033591792112795570505555451438788818815351948593446724642949
PRINTF 86405062651842447539566378337348220753329448130649336035461010177464932
PRINTF 67877167198612073968320123596077290246830459403130563776313240108042028
PRINTF 54359026945094030740014933950767316028502869730318718239984335257435499
PRINTF 56085025660897833955642114948073393626075102381833141100470890395013433
PRINTF 02974134748405406158775396888381540769801776730369991074924697847843128
PRINTF 43036411289202801227256346839162335478772734006395865717981906935812738
PRINTF 70343353131890503838456164444429279690638379690924413039656009876635846
PRINTF 27766076053486974908593811939309251791198855527765356660762439356877194
PRINTF 23316664283820074481630786522923565982658627591874752087509144760901697
PRINTF 35693572318242499194754944316314633922707432445903024825444124903594099
PRINTF 00711377326310998077723937579092667787226299567773759125268754691760395
PRINTF 50147363373746164507645777159814661075839930304323134949658648228467849
PRINTF 52475402979689001510984243408167226410534651753188957093414626589139801
PRINTF 73123676248874585502699619246678052425882378995907144185753559519019313
PRINTF 82627559350018482608107690649406792443588583150352601704500934671408738
PRINTF 47278951678454152522670236969054686984460721098217747366065475232420890
PRINTF 63817688335653308345429052023662173681689021810091859270116416256337109
PRINTF 21091919381108840837199549413952808743847659331516464524483714349554707
PRINTF 17674786446677777732624005994428038830050520639602544872219401004824568
PRINTF 35584914116372021650148290488954105485988582139146739280127960783765798
PRINTF 03019019789583129844584235055162804713845709836714317959849798338493664
PRINTF 30513574397784640802839449964928361772012175297196011151892334756646644
PRINTF 87214704005952700208247971196291259570791322875250242252752285663130430
PRINTF 51121949377154294071049224714797815453298506882289269488223421814177347
PRINTF 79122897059303851678644123512773213799470513130865615260917674214128978
PRINTF 99794277108410495559054929958460312928956410723379347221982629602465778
PRINTF 85985564915354252627693645781715768750562769595194061476951336213589662
PRINTF 11036723901861834491008470560232408857407364980649133504977272298541215
PRINTF 59922564905903204165800777130365484448790985802916806772298738484805391
PRINTF 43712688284515899628929046495360697649132264972901758104649039660124902
PRINTF 68091442266963065322405327931805201389903025067016459461450932789918749
PRINTF 75058845950892812313146535841156454786288876711792759444189103227033972
PRINTF 69941078262094906693919323364952884039841053417413103028429829838459727
PRINTF 45096891535052298062324409231381105784356547340986751938386072824078975
PRINTF 23799065221536828449641519520436143002340641177671720819470742812470616
PRINTF 07867913971425756102687732567256651877939886954356124725250382411053283
PRINTF 79424949624250753859855661301876325664637266952818991824470283360644583
PRINTF 37646946313432313914423530118060852644531774177829165319868877107483800
PRINTF 21596407193754688536272412072273614121001398409439077271839347179202289
PRINTF 57418965620956779770395475952722338278919094369589561415818837608560348
PRINTF 34063384906401173599284897189611655350881207229167416716902882811061704
PRINTF 37298078875517486927574003575530600731535192433495073405019379393181058
PRINTF 87674528352364165276903716517492244596252440613260215331480242729696226
PRINTF 77771345064205355151024831061088767266294511910945817001717135117299890
PRINTF 02590464476306635972419905184702491108868012469419154965444117599593375
PRINTF 38678615328851050554859575190528299267459748057766950034966002392489117
PRINTF 57605776389825650559728570971462382136461281575699631424743375953147095
PRINTF 06808920446151687031940548311378504976842184577809636035835677681422980
PRINTF 40151305880322138870061436664661951545626428267979005809070233129442562
PRINTF 72924492983069495929158131775714230468246948685772631408192667492977253
PRINTF 97437563652031075562027651175474076728453283199975064855371567852134047
PRINTF 92052661535026151150356406251891749898352529576071186118970319875837950
PRINTF 86582124471448554809890971322784110436163215054456189256415686480097687
PRINTF 52717279363166235676934305070540096078529757697334781633978650356344759
PRINTF 60831130589187410887754764989661407670935400992896790330051971292292946
PRINTF 55908813155093792311652676248200974255320501351088304034115018668628172
PRINTF 78444851832099989311381048011484736433826155047482700010272787155537960
PRINTF 76269777262330614285973220370177164880139359067316515931312072256600100
PRINTF 57306787815875968237327432664863152551632539069522055152460415473672664
PRINTF 71582971975910017632924675539194792441259685598691843126671687606611477
PRINTF 42124931315475771110299052756988494792182854292007719911381106402013108
PRINTF 50296702223726371927432520508605883939945814143535946600955513703685151
PRINTF 43953442513225162327448176928474833516018143148377577291026597705021637
PRINTF 55133176163209069621023638527066273373626901536962550736108842415763081
PRINTF 51549058618960450223864851723710305700902066826458514960037828134831228
PRINTF 85470859777556219575228696991467195454031081669626969029689622271555266
PRINTF 81230033365499284279246122503955709996146929716943393515445085575161428
PRINTF 66963351826393901192551987712707669409200410842997382985072562990345850
PRINTF 43358447819361749768796368848151701460384149318945856731095588979938416
PRINTF 66166974318421510886781827472747719220624397780660846646309778881953155
PRINTF 96103595567520455859572036591635522582959536734037782112409893988073073
PRINTF 54183621312108837517038036823616360448183264453177108673935775104191422
PRINTF 09069669178226474762487269198026128610837448652330441858074374067999246
PRINTF 59359146430681924558555771836314041797245925837265876930872766629910804
PRINTF 50434859149599696250194105269190482762996890300360235763642425439767056
PRINTF 45825409964140414509946703453877431257026552921996084282110706720391105
PRINTF 26250561980914926429443191308362902026158372172207479904166903500594072
PRINTF 81416049396536557273412233933280610337003702100821153419222729353491676
PRINTF 20798856258516798064791585869423911461787263578592913577266496560330715
PRINTF 30623284906061534893403415083718713352615870286424651827290435568456552
PRINTF 10616172517713662500525479944598546766181778602587780380603098444544523
PRINTF 17643005733785587904551361131625441388511369580948358495849149344212511
PRINTF 05356131748993037427322843117221670118781585253525367038010852035298366
PRINTF 28261634712978968330653152100024743219392167059907313340302283790531537
PRINTF 48333258860259686717330067302626129552556379339768173004852394067955491
PRINTF 41449465881533293253950529298238146676292254117063142342669344856493368
PRINTF 63253141874713964341590528373380575999077665816938171897175745861747239
PRINTF 10973841969976461995896793918544510980595857872497164218482449641744096
PRINTF 57751106908793193517216376221168703906403811218433433897183542006078817
PRINTF 35682569788674893046558524458837592126484177992083637592780521588288659
PRINTF 44443470717721204674663717403307076091891373969561434777418394409918035
PRINTF 48117653335878130833510705646029925695615913401856601359655561
tayhyp: BNCDMP mpf_set_default_prec(p) tot=0.000003063
tayhyp: BNCDMP mpf_init(nlog2) tot=0.000002623
tayhyp: BNCDMP mpf_inp_str(nlog2,l_2_in,10) tot=0.003018814
tayhyp: BNCDMP mpf_init_set_str(N,argv[1],10) tot=0.000001509
tayhyp: BNCDMP mpf_init_set_ui(T,10) tot=0.000000533
tayhyp: BNCDMP mpf_pow_ui(T,T,n) tot=0.000086848
tayhyp: BNCDMP mpf_ui_div(T,1,T) tot=0.000380477
tayhyp: BNCDMP mpf_init(V) tot=0.000000271
tayhyp: BNCDMP mpf_set_d(V,mpf_get_d_2exp(&e,N)) tot=0.000001757
tayhyp: BNCDMP mpf_init(R) tot=0.000000037
tayhyp: BNCDMP mpf_init(tmp) tot=0.000000084 avg=0.000000042 cnt=2
tayhyp: BNCDMP   min=0.000000058 max=0.000000068
tayhyp: BNCDMP mpf_mul_ui(tmp,nlog2,e) tot=0.000002245
tayhyp: BNCDMP mpf_add(R,R,tmp) tot=0.000001428
tayhyp: BNCDMP mpf_add(R,R,tmp) tot=0.224560256
tayhyp: BNCDMP mpf_init(x) tot=0.000000362
tayhyp: BNCDMP mpf_init(y) tot=0.000000037
tayhyp: BNCDMP mpf_init(d) tot=0.000000037
tayhyp: BNCDMP mpf_sub_ui(x,N,1) tot=0.000001437
tayhyp: BNCDMP mpf_init_set(y,x) tot=0.000000344
tayhyp: BNCDMP mpf_init_set(r,x) tot=0.000000050
tayhyp: BNCDMP mpf_init_set_ui(pr,0) tot=0.000000056
tayhyp: BNCDMP mpf_sub(d,r,pr) tot=0.017311085 avg=0.000001042 cnt=16,604
tayhyp: BNCDMP   min=0.000000223 max=0.000063322
tayhyp: BNCDMP mpf_abs(d,d) tot=0.000164212 avg=0.000000009 cnt=16,604
tayhyp: BNCDMP   min=0.000000026 max=0.000003257
tayhyp: BNCDMP mpf_abs(d,d) tot=0.223701027 avg=0.000013472 cnt=16,604
tayhyp: BNCDMP   min=0.000000109 max=0.000129095
tayhyp: BNCDMP cmpflg = mpf_cmp(d,T) tot=0.000159926 avg=0.000000009 cnt=16,604
tayhyp: BNCDMP   min=0.000000021 max=0.000009950
tayhyp: BNCDMP mpf_set(pr,r) tot=0.004447240 avg=0.000000267 cnt=16,603
tayhyp: BNCDMP   min=0.000000210 max=0.000004657
tayhyp: BNCDMP mpf_mul(y,y,x) tot=0.030792804 avg=0.000000927 cnt=33,206
tayhyp: BNCDMP   min=0.000000080 max=0.000088428
tayhyp: BNCDMP mpf_div_ui(tmp,y,++n) tot=0.128409872 avg=0.000003867 cnt=33,206
tayhyp: BNCDMP   min=0.000003728 max=0.000109781
tayhyp: BNCDMP mpf_sub(r,r,tmp) tot=0.017006018 avg=0.000001024 cnt=16,603
tayhyp: BNCDMP   min=0.000000651 max=0.000090142
tayhyp: BNCDMP mpf_add(r,r,tmp) tot=0.016594638 avg=0.000000999 cnt=16,603
tayhyp: BNCDMP   min=0.000000635 max=0.000064980
tayhyp: BNCDMP mpf_set(R,r) tot=0.000000239
23:32:19 tayhypgo: 2 10000 33296 1 -- complete

--------------------------------------------------------------------------------
23:32:19 tayhypgo: ARGV 2 10000 33296 2
23:32:19 ovrgo: SDIR /home/cae/preserve/ovrbnc/tayhyp
23:32:21 ovrgo: /home/cae/preserve/ovrstk/gen/tayhyp/tayhyp 2 10000 33296 2
bnctst: BEST bncmin=19 bncmax=-1 skipcnt=1000
bnctst: AVG tot=0.000000000

bnctst: SKP tot=0.000024750 avg=0.000000024 cnt=1,000
bnctst: BNCDMP   min=0.000000000 max=0.000000000

2
PRINTF 0.50000000000000000000
Case 2
20951
PRINTF elided for space
tayhyp: BNCDMP mpf_set_default_prec(p) tot=0.000003154
tayhyp: BNCDMP mpf_init(nlog2) tot=0.000002571
tayhyp: BNCDMP mpf_inp_str(nlog2,l_2_in,10) tot=0.002921480
tayhyp: BNCDMP mpf_init_set_str(N,argv[1],10) tot=0.000001506
tayhyp: BNCDMP mpf_init_set_ui(T,10) tot=0.000000554
tayhyp: BNCDMP mpf_pow_ui(T,T,n) tot=0.000090689
tayhyp: BNCDMP mpf_ui_div(T,1,T) tot=0.000290122
tayhyp: BNCDMP mpf_init(V) tot=0.000000179
tayhyp: BNCDMP mpf_set_d(V,mpf_get_d_2exp(&e,N)) tot=0.000001484
tayhyp: BNCDMP mpf_init(R) tot=0.000000051
tayhyp: BNCDMP mpf_init(tmp) tot=0.000000066 avg=0.000000033 cnt=2
tayhyp: BNCDMP   min=0.000000052 max=0.000000052
tayhyp: BNCDMP mpf_mul_ui(tmp,nlog2,e) tot=0.000001359
tayhyp: BNCDMP mpf_add(R,R,tmp) tot=0.000001408
tayhyp: BNCDMP mpf_init(x) tot=0.000000339
tayhyp: BNCDMP mpf_init(d) tot=0.000000039
tayhyp: BNCDMP mpf_sub_ui(x,N,1) tot=0.000001328
tayhyp: BNCDMP mpf_init_set(r,x) tot=0.000000537
tayhyp: BNCDMP mpf_init_set_ui(pr,0) tot=0.000000054
tayhyp: BNCDMP mpf_sub(d,r,pr) tot=0.011097537 avg=0.000001059 cnt=10,476
tayhyp: BNCDMP   min=0.000000739 max=0.000004082
tayhyp: BNCDMP mpf_abs(d,d) tot=0.000123286 avg=0.000000011 cnt=10,476
tayhyp: BNCDMP   min=0.000000021 max=0.000000364
tayhyp: BNCDMP cmpflg = mpf_cmp(d,T) tot=0.000141611 avg=0.000000013 cnt=10,476
tayhyp: BNCDMP   min=0.000000022 max=0.000002173
tayhyp: BNCDMP mpf_set(pr,r) tot=0.002824968 avg=0.000000269 cnt=10,475
tayhyp: BNCDMP   min=0.000000265 max=0.000003199
tayhyp: BNCDMP mpf_add(r,r,tmp) tot=0.010823055 avg=0.000001033 cnt=10,475
tayhyp: BNCDMP   min=0.000000713 max=0.000004003
tayhyp: BNCDMP mpf_set(R,r) tot=1.366819061
tayhyp: BNCDMP mpf_init(x2) tot=0.000000041
tayhyp: BNCDMP mpf_add_ui(tmp,N,1) tot=0.000000385
tayhyp: BNCDMP mpf_div(x,x,tmp) tot=0.000011269
tayhyp: BNCDMP mpf_mul(x2,x,x) tot=0.000098311
tayhyp: BNCDMP mpf_abs(d,d) tot=1.366190674 avg=0.000130411 cnt=10,476
tayhyp: BNCDMP   min=0.000000117 max=0.000460681
tayhyp: BNCDMP mpf_mul(x,x,x2) tot=1.296999859 avg=0.000123818 cnt=10,475
tayhyp: BNCDMP   min=0.000121270 max=0.000453871
tayhyp: BNCDMP mpf_div_ui(tmp,x,++n) tot=0.040474597 avg=0.000003863 cnt=10,475
tayhyp: BNCDMP   min=0.000003748 max=0.000011291
tayhyp: BNCDMP mpf_mul_ui(R,r,2) tot=0.000002962
23:32:22 tayhypgo: 2 10000 33296 2 -- complete

23:32:22.758210659 ph: complete (ELAPSED: 00:00:08.522868633)

