stephenbrooks.org Forum : Muon1 : General : 15 percent???
Jwb52z
2003-07-27 11:48:20
The front page of the website on the chart says that the highest percentage so far is over 15 percent.  Is that correct or another mistake?
Diabolos.de
2003-07-28 00:58:13
My best: 15.77248
Stephen Brooks
2003-07-28 04:55:37
That's interesting - did your client re-check that one?  (i.e. look at the result: does it have #runs=005; in it?)

Today's weather in %region is Sunny/(null), max.  temperature #NAN°C
Herb[Romulus2]
2003-07-28 10:28:08
quote:
That's interesting - did your client re-check that one? 
Maybe I got that wrong, but we already have 250 rechecked results ranging from better than 15.40 up to 15.79.  That's the range we're working through at the moment; the first 15.8x results are coming in already.  Peaks in the queue.txt show values up to 15.88.

Grab the toplists from either:
http://stephan202.qik.nl/ (top 250/500/1000)
http://webwi.de/data/results.dat (250 rechecked results only)
http://webwi.de/results.dat (a promising new breed with only 29 solenoids)

-------------------------------
I'd say more, but I can't reach the keyboard from the floor.
Stephen Brooks
2003-07-28 16:31:26
Sorry, I must have briefly lost the plot, being on holiday.

Those results on your lists have most of the parameters pushed all the way to 000 or 999, which is probably what we could have expected on this optimisation given I haven't put any stringent conditions on the end of the beamline.  I could try one with an aperture restriction (final solenoid must only be 10cm radius) next - should do something more interesting.

Anyway, the fact that the muon %age has been on such a gradual incline of late has to do with the behaviour of my optimiser in situations like this, where you just want to ascend a hill whose gradient is spread across, say, 100 different parameters.  It goes about 50 times slower than it really ought to in certain gradient directions.  When I get back to work the first priority is going to have to be (v5) putting a new design and components through with the existing optimiser, but I hope I'm going to get time soon after that to try a few more interesting optimisation techniques - ones that really use the results.dat more intelligently than what we've been doing up until now.  I'm talking about some sort of scheme whereby Muon compares result points with nearby ones, estimates a local gradient and tries to ascend that.  This is easier said than done with noisy data, but even so the payoffs could be worth it.
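The local-gradient scheme could be sketched roughly as follows - a least-squares fit of a linear model through nearby noisy result points, which averages the noise out.  This is only an illustration of the idea; the function and data names are invented here, not part of Muon1:

```python
import numpy as np

def estimate_local_gradient(points, scores):
    """Least-squares fit of a local linear model score ~ g.x + c,
    averaging the noise out across nearby result points."""
    points = np.asarray(points, dtype=float)
    scores = np.asarray(scores, dtype=float)
    # Design matrix with a trailing column of ones for the intercept
    A = np.hstack([points, np.ones((len(points), 1))])
    coeffs, *_ = np.linalg.lstsq(A, scores, rcond=None)
    return coeffs[:-1]          # drop the intercept; keep the gradient

# Noisy samples of f(x) = 2*x0 - x1 scattered around a base point
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 2))
vals = 2 * pts[:, 0] - pts[:, 1] + rng.normal(scale=0.5, size=200)
g = estimate_local_gradient(pts, vals)   # roughly [2, -1]
```

With enough neighbouring samples the fitted gradient is far less noisy than any single score difference, which is the whole attraction of the scheme.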

MaFi
2003-07-29 00:40:37
hi
As far as I understand Monte Carlo simulations, the noise can easily be reduced by increasing the number of particles.  I think this project has enough computing power for an increase, let's say by a factor of 3-5 Smile

markus
Herb[Romulus2]
2003-07-29 11:52:52
quote:
final solenoid must only be 10cm radius

15.804945, 15.783804, 15.712722, and the current temp-file is falling Red Face
Doesn't look so promising, but it has a pretty good variance at least Big Grin

Herb[Romulus2]
2003-07-30 00:05:11
Improved at the finish to finally 15.772544 (1779.1 Mpts).  It's worth playing a bit more with this one.

Stephen Brooks
2003-08-01 06:53:52
quote:
Originally posted by MaFi:
as far as i understood monte carlo simulations, the noise can easily be reduced by increasing the number of particles.  i think this project has enough computing power for a increase, let's say by a factor of 3 - 5 Smile


Yes, the trouble is that to reduce the noise by a factor of X you need to increase the number of particles by a factor of X^2. So we're on the wrong side of a square law.  I've already actually increased the effective number of particles in this simulation a few times: in early versions I think I went 1000 to 5000 to 21000 particles, then later on I added a multiple-decays feature that made this behave more like 100k particles.  Now with the optional rechecking, a 5x rechecked result is virtually done with 500k particles.
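The square law comes straight from counting statistics: an estimate of the transmission fraction from N independent particles has standard deviation about sqrt(p(1-p)/N), so halving the noise needs 4N particles.  A quick illustration (the 15% survival probability and the particle counts are made up for the demo, not Muon1's actual figures):

```python
import numpy as np

def transmission_noise(n_particles, p=0.15, trials=2000, seed=1):
    """Standard deviation of the estimated transmission fraction when
    each of n_particles independently survives with probability p."""
    rng = np.random.default_rng(seed)
    survived = rng.random((trials, n_particles)) < p
    return survived.mean(axis=1).std()

noise_1k = transmission_noise(1000)   # ~ sqrt(0.15*0.85/1000) ≈ 0.011
noise_4k = transmission_noise(4000)   # 4x the particles, ~half the noise
```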

Really, the noise is going to be an issue to an algorithm regardless of how small it is, so I've just got to figure out a way of coping with it adequately...

David
2003-08-03 17:50:48
quote:
Originally posted by Stephen Brooks:
[snip]
When I get back to work the first priority is going to have to be (v5) putting a new design and components through with the existing optimiser, but I hope I'm going to get time soon after that to try a few more interesting optimisation techniques - ones that really use the results.dat more intelligently than what we've been doing up until now.  [snip]



You might consider an alternative scoring scheme, one based perhaps on particle-metres rather than a pure particle count - that way designs with a good front end don't get completely rejected because the very last solenoid (or whatever) eats them all.  It might provide useful "breeding stock" when the genetic algorithm chooses to interpolate between two runs.
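The particle-metres idea could look something like this minimal sketch - the function name and the normalisation are my own invention, not an actual Muon1 scoring rule:

```python
def particle_metre_score(path_lengths, beamline_length):
    """Score a run by the total distance travelled by all particles
    (particle-metres), normalised so a lossless run scores 1.0.
    A design whose final solenoid eats everything still earns partial
    credit for a good front end, instead of scoring zero."""
    ideal = len(path_lengths) * beamline_length
    return sum(min(d, beamline_length) for d in path_lengths) / ideal

# Three particles in a 10 m channel: one lost at 4 m, one lost at 9 m,
# and one transmitted the full length.
score = particle_metre_score([4.0, 9.0, 10.0], 10.0)   # (4+9+10)/30 ≈ 0.77
```

A pure particle count would score this run 1/3; the particle-metres version rewards the two near-misses as well, giving the optimiser a smoother slope to climb.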
Stephen Brooks
2003-08-04 04:39:20
That sort of idea came up in this thread, post 2 paragraph 2. I'll certainly use something of the sort if the current score-rating doesn't seem to be giving sufficient 'slope' for the optimiser to climb.

David
2003-08-04 14:01:15
quote:
Originally posted by Stephen Brooks:
That sort of idea came up in this thread (http://www.stephenbrooks.org/groupee/forums?a=tpc&s=724606111&f=8226012111&m=1486021843), post 2 paragraph 2.
[snip]



Ah, I see - that'll teach me to drop out for a while and then not read up before opening my mouth.

David
Stephen Brooks
2003-08-05 06:24:41
Well, to tell the truth, it took me several minutes to find that post in this forum, buried as it was in a conversation about other things.  But I was sure I'd said something of the sort before.

Yesterday I wrote a program that implements a general form of a simple neural network called a 'perceptron', which can train on data and recognise patterns in it.  Neural nets and similar things aren't massively intelligent, but they could be useful for our problem since they tend to be good at detecting patterns in highly multidimensional data.  I'm also comparing how they work with some other optimisation algorithms - a sort of hybrid approach might actually be best.
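For reference, the classic perceptron training rule is only a few lines: whenever a sample lands on the wrong side of the decision boundary, nudge the weights towards it.  This toy sketch is the textbook algorithm on made-up data, not the actual Muon1 code:

```python
import numpy as np

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Classic perceptron rule for labels in {-1, +1}: on each
    misclassified sample, move the weights towards that sample."""
    X = np.asarray(X, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # wrong side of (or on) the boundary
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: label is the sign of x0 + x1 - 1.5
X = [[0, 0], [0, 1], [1, 0], [1, 1], [2, 0], [0, 2]]
y = [-1, -1, -1, 1, 1, 1]
w, b = train_perceptron(X, y)
preds = [1 if np.dot(x, w) + b > 0 else -1 for x in X]
```

On linearly separable data like this the rule is guaranteed to converge; on noisy data it keeps oscillating, which is presumably where the hybrid approaches come in.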

It doesn't make any sense: that's why they call it "virtual"