Anyone got any benchmarking data for v4.2?
Stephen Brooks
2002-08-04 14:27:26
I've just looked at the graph of total particles simulated against time, and it looks like v4.2 simulates more particles than the previous versions... The graph takes an upward turn around where I released v4.2.  Reckoning on v4.13's benchmark, that makes the Athlon-equivalent speed over the last 7 days just over 100 GHz, but I think it's really more like 50 GHz because of the version change.

Benchmarking v4.2 is more difficult because, strictly, people ought to give me the total number of particles simulated (i.e. add up the particle counts from a batch of results), since the count varies between simulations.  Anyway, the figure of interest is particles per second per GHz (quote your CPU type).
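A minimal sketch of that calculation in Python (the totals in the example call are made up, not real results):

def particles_per_sec_per_ghz(total_particles, elapsed_seconds, total_ghz):
    # total_ghz is the summed clock speed of all contributing CPUs,
    # e.g. ten 500 MHz machines -> 5.0 GHz.
    return total_particles / elapsed_seconds / total_ghz

# Hypothetical example: 10,000,000 particles in 72 hours on one 900 MHz PIII.
print(round(particles_per_sec_per_ghz(10_000_000, 72 * 3600, 0.9), 1))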


"As every 11-year-old kid knows, if you concentrate enough Van-der-Graff generators and expensive special effects in one place, you create a spiral space-time whirly thing, AND an interesting plotline"
Bluumi [SwissTeam.NET]
2002-08-05 02:10:19
Hi Stephen ...

I made 874 muons with 29,380,086 particles on ten PIII 500 MHz machines in 41 hours.

That works out to about 40 particles/sec/GHz.  Right?

See You
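A quick check of that figure in Python, using the totals quoted above:

# 29,380,086 particles in 41 hours on ten 500 MHz PIIIs (5.0 GHz total).
total_particles = 29_380_086
elapsed_seconds = 41 * 3600
total_ghz = 10 * 0.5
print(total_particles / elapsed_seconds / total_ghz)  # ~39.8 particles/sec/GHz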
gogomaus
2002-08-05 08:18:00
Hello Stephen,
Full confirmation of Bluumi's performance.
My first 288 runs simulated some 9,152,000 particles during 72 hours on my PIII 900 MHz --> 39.2 particles/GC (i.e. particles/sec/GHz).
That's 31,778 particles per run at about 15 minutes each, with a muon yield of about 0.1%.
This morning I watched a super-long background run that had written an auto.sav and was still going after 70 minutes!
Unfortunately I had to leave before it finished, but it looks like a very big fish.
gogomaus
2002-08-05 10:52:12
Hello,
Just coming back home, I could fetch the last run: it lasted 2 hours and a minute!!
The muon yield was 0.864%, and 44,940 particles were simulated.
If I take this one, I come down to a rate of 6.9 particles per GC (P3, 900 MHz).

Obviously benchmarking is non-linear; it looks like an exponential increase in time when calculating better designs.  I'm a little scared of spending 8 or 12 hours on one run to reach a muon yield > 2%, while only some 50,000 or 60,000 particles are simulated (--> 1 particle/GC).

Is there a chance to reduce calculation time without losing much parameter freedom?
Stephen Brooks
2002-08-05 12:54:55
quote:
Originally posted by gogomaus:
Hello,
Just coming back home, I could fetch the last run: it lasted 2 hours and a minute!!
The muon yield was 0.864%, and 44,940 particles were simulated.
If I take this one, I come down to a rate of 6.9 particles per GC (P3, 900 MHz).

Obviously benchmarking is non-linear; it looks like an exponential increase in time when calculating better designs.


Well, it looks like it is linear, but not linear in the particle count!  Rather, from what you've said, the calculation time appears to be proportional to the muon yield: 15 minutes for about 0.1%, but just over 2 hours for over 0.8%.

This is going to happen because the particles that make it through the whole accelerator will take up much more calculation time than the ones hitting the first solenoid.
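As a toy illustration of that effect in Python (not the actual simulation code; the survival numbers are invented):

# Each particle costs roughly one unit of work per timestep it survives,
# so long-lived particles dominate the runtime.
lossy_design = [50] * 950 + [10_000] * 50    # most particles die early
good_design  = [50] * 500 + [10_000] * 500   # many survive the full length

print(sum(lossy_design))  # 547,500 work units
print(sum(good_design))   # 5,025,000 -- roughly 9x slower, even though
                          # both designs launch 1000 particles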

quote:
I'm a little scared of spending 8 or 12 hours on one run to reach a muon yield > 2%, while only some 50,000 or 60,000 particles are simulated (--> 1 particle/GC).

Is there a chance to reduce calculation time without losing much parameter freedom?


The point of releasing v4.2 after v4.1 was to do a more difficult, calculation-intensive optimisation.  I had expected people to notice that the units took much longer from the start, but because of the difficulty of the optimisation, there were a lot of very 'lossy' designs which could be ruled out quickly.  Things are certainly going to get slow when the good designs come up - a design that beats the v4.1 high (say 2.4%) would probably take about 6 hours on your computer.  That is why I put the auto-save feature in.  The particle count will increase for such a design, but unfortunately not quite enough to balance the extra work done.

The 'fair' measure of performance is actually particle-timesteps.  That would give huge figures, though (tens of millions per simulation!), and would also require me to rewrite a lot of the stats-generation code.  There are plans for a version 5 a few months from now, where I might include that as a fairer way of scoring (say, 1 'point' per 10^6 particle-timesteps).
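A sketch of how such a score might be computed (an assumption based on the figure above, not a final v5 spec):

def score(particle_timesteps):
    # 1 'point' per 10^6 particle-timesteps.
    return particle_timesteps / 1_000_000

print(score(45_000_000))  # a 45-million particle-timestep run -> 45.0 points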


"As every 11-year-old kid knows, if you concentrate enough Van-der-Graff generators and expensive special effects in one place, you create a spiral space-time whirly thing, AND an interesting plotline"
Pascal
2002-08-07 12:04:19
Although it may be a bit difficult for most participants to count particles, it is possible to use MS Excel to extract the particle counts from every single simulation.
But I don't find it worth the time when there are so many simulations with a yield of less than 0.300000 percent.
Is my view reasonable?
What do you think in general?
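For what it's worth, a short script can do the same tallying as Excel; a Python sketch, assuming each results line carries a field like "particles=NNN" (the real results-file format and file name may differ):

import re

total = 0
with open("results.txt") as f:      # file name is an assumption
    for line in f:
        m = re.search(r"particles=(\d+)", line)
        if m:
            total += int(m.group(1))
print(total, "particles simulated in total")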

___________________________
1: Athlon TB-C, 1.2 GC/s, 256 MB DDR-RAM, Erazor x², ADSL flatrate, Intel NIC, Win 98 SE, mainboard MSI-6380 Rev. 1
2: Pentium III, 600 MC/s, 256 MB RAM, Intel NIC, Win 98 SE
pben
2002-08-07 18:11:58
I did a 10-hour run on my dual 1.2 GHz Athlon PC.  It did 4,783 particles/min, or 286,995 particles/hour.  This is with two background muon1 v4.2 clients.
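For comparison with the single-CPU figures above, a quick conversion of those numbers (Python; treating the dual 1.2 GHz CPUs as 2.4 GHz total):

print(286_995 / 3600 / 2.4)  # ~33.2 particles/sec/GHz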

Nice to have my PC back together after tracking down my hardware problem (bad memory).
Stephen Brooks
2002-08-08 02:44:46
Well, it looks like it's rather difficult to benchmark v4.2x because the results vary with the muon percentage.  I'll probably have to remove the GHz count from the page until version 5, when I'll change the results format a bit to include a fairer quantity than the number of particles.


"As every 11-year-old kid knows, if you concentrate enough Van-der-Graff generators and expensive special effects in one place, you create a spiral space-time whirly thing, AND an interesting plotline"