Add adaptive mutation for ES#129

Open
RomanSraj wants to merge 9 commits into master from add_adaptive_mutation_for_es

Conversation

@RomanSraj
Collaborator

Using HalfnormIncrease or UniformIncrease as the mutation operator with ES caused issues with discovering new rules: once a peak was reached (in terms of a rule's fitness), the fitness of subsequent rules declined quickly with each iteration.

This PR introduces an adaptive component to the mutation of ES with either HalfnormIncrease or UniformIncrease. Once a peak is reached and the rules' fitness begins to decrease, the mutation type switches from the original HalfnormIncrease or UniformIncrease to Normal.
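The switching behaviour described above can be sketched roughly as follows. This is an illustrative standalone sketch, not the suprb implementation; the class name, the string placeholders for operators, and the `patience` parameter are all invented here:

```python
class AdaptiveMutationSketch:
    """Wrap a primary mutation operator and fall back to a 'normal' one
    once the elitist fitness has stopped improving for `patience`
    consecutive iterations (illustrative sketch only)."""

    def __init__(self, primary, fallback, patience=3):
        self.primary = primary      # e.g. a HalfnormIncrease-style operator
        self.fallback = fallback    # e.g. a Normal-style operator
        self.patience = patience
        self.best_fitness = float("-inf")
        self.worse_iterations = 0
        self.operator_ = primary

    def update(self, elitist_fitness):
        # A new best fitness resets the counter and the operator;
        # `patience` non-improving iterations trigger the switch.
        if elitist_fitness > self.best_fitness:
            self.best_fitness = elitist_fitness
            self.worse_iterations = 0
            self.operator_ = self.primary
        else:
            self.worse_iterations += 1
            if self.worse_iterations >= self.patience:
                self.operator_ = self.fallback
        return self.operator_
```

Here the operators are plain placeholders; in the PR they would be RuleMutation instances.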

@RomanSraj RomanSraj requested a review from heidmic October 12, 2022 10:58
Comment thread suprb/optimizer/rule/es/es.py Outdated
@RomanSraj RomanSraj force-pushed the add_adaptive_mutation_for_es branch from a6a4a00 to 08ad29c Compare October 12, 2022 17:15
Comment thread suprb/optimizer/rule/es/es.py
Comment thread suprb/optimizer/rule/es/es.py
Comment thread suprb/optimizer/rule/generation_operator.py Outdated
Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/mutation.py
Comment thread suprb/optimizer/rule/mutation.py Outdated
else:
self.number_of_worse_iterations = 0
self.best_elitist_fitness = elitist_fitness
self.operator_ = self.operator
Owner

this line is not needed

Collaborator Author

It's needed because we set the mutation operator at runtime (I think). If we leave it out, self.operator_ has no matching_type and therefore rules don't mutate at all.

Owner

If this happens, I suspect the initial setting of the operator is somewhat broken. Other than the reset each cycle, this should not be needed at all. From a design perspective, we set operator_ to some RuleMutation in init, change it once some criterion evaluates to true, and reset it at the beginning of a new cycle. However, the current implementation sets it much more often than that.

Owner

at the very least we should clone it, don't we?

Collaborator Author

Yes, there were multiple issues:

  • _validate_matching_type needs to be called before _validate_rule_generation
  • We need to specifically set matching_type when we create the model

With these points changed, it works without assigning the operator
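Why the validation order matters can be illustrated with a minimal sketch. All method and attribute names here are hypothetical, modeled loosely on the comment above, not taken from suprb:

```python
class ModelSketch:
    """Minimal sketch of why a matching-type check must run before
    rule-generation validation (hypothetical names)."""

    def __init__(self, matching_type=None):
        self.matching_type = matching_type
        self.rule_generation = None

    def _validate_matching_type(self):
        # Resolves the default; must happen first.
        if self.matching_type is None:
            self.matching_type = "ordered_bound"

    def _validate_rule_generation(self):
        # Depends on matching_type being resolved already; if this ran
        # first, the mutation operator would be configured without a
        # matching type and rules would not mutate at all.
        self.rule_generation = f"mutation_for_{self.matching_type}"

    def fit(self):
        self._validate_matching_type()   # must run first
        self._validate_rule_generation()
        return self
```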

Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/mutation.py Outdated
Comment thread suprb/optimizer/rule/es/es.py
Comment thread suprb/optimizer/rule/mutation.py Outdated

class AdaptiveMutation(RuleMutation):
"""Start off by using the given operator until the fitness has decreased
more than self.number_of_worse_iterations times, then switch the mutation operator
Owner

This should only work with & or , replacement strategies, shouldn't it? So we should probably note that here. Or maybe even better, don't require the fitness to have decreased but rather not to have increased? That way this would work for + ES as well (albeit likely with a different tolerance required)

Collaborator Author

Done
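The suggested criterion — counting iterations without an increase rather than requiring a strict decrease — can be sketched as follows. This is an illustrative helper with an invented name, not code from this PR:

```python
def count_stale(fitness_history):
    """Number of trailing iterations without a new best fitness
    (a non-increase counter, illustrative only)."""
    best = float("-inf")
    last_improvement = -1
    for i, fitness in enumerate(fitness_history):
        if fitness > best:
            best = fitness
            last_improvement = i
    return len(fitness_history) - 1 - last_improvement
```

With '+' replacement the elitist fitness never strictly decreases, so a strict-decrease counter would never fire; a history like [1, 2, 2, 2] still yields two stale iterations here, so the operator switch can still trigger.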

Comment thread suprb/optimizer/rule/mutation.py Outdated
"""

def __init__(self, matching_type: MatchingFunction = None,
sigma: Union[float, np.ndarray] = 0.1,
Owner

Does this sigma have an effect or is it just a dummy? We could just pass the operator's sigma to super, couldn't we?

Collaborator Author

Done

Comment thread suprb/optimizer/rule/mutation.py Outdated
else:
self.number_of_worse_iterations = 0
self.best_elitist_fitness = elitist_fitness
self.operator_ = self.operator
Owner

at the very least we should clone it, don't we?
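One way to clone before assigning, as suggested, is a deep copy. This is a minimal sketch with invented names; scikit-learn-style estimators could alternatively use `sklearn.base.clone`:

```python
from copy import deepcopy


class OperatorHolder:
    """Sketch of resetting an operator from a configured template
    (invented names, not suprb code)."""

    def __init__(self, operator):
        self.operator = operator      # the configured template

    def reset(self):
        # Assign a deep copy: later in-place changes to operator_
        # (e.g. sigma adaptation) cannot leak back into the template.
        self.operator_ = deepcopy(self.operator)
```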

Comment thread suprb/optimizer/rule/mutation.py Outdated
self.operator_ = self.operator

def unordered_bound(self, rule: Rule, random_state: RandomState):
raise TypeError("AdaptiveMutation has no implementation for unordered_bound")
Owner

Why not use NotImplementedError if the message conveys exactly that? But I think we should give more explanation here: it is unimplemented not because we didn't do it (yet), but because it makes no sense to do it, as this is only a wrapper.

Collaborator Author

Done
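The agreed-on pattern — `NotImplementedError` plus a message explaining that the method is unimplemented by design — might look like this (illustrative names, not the PR's code):

```python
class AdaptiveWrapperSketch:
    """Illustrative wrapper that delegates all mutation to operator_."""

    def __init__(self, operator_):
        self.operator_ = operator_

    def unordered_bound(self, rule, random_state):
        # Unimplemented by design, not as a TODO: this class only wraps
        # another mutation operator, so a direct bound implementation
        # would never make sense here.
        raise NotImplementedError(
            "AdaptiveWrapperSketch is only a wrapper around another "
            "mutation operator; it has no unordered_bound of its own."
        )
```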

@RomanSraj
Collaborator Author

RomanSraj commented Sep 10, 2024

Add try catch to go into default mode if not OrderedBound

@RomanSraj RomanSraj force-pushed the add_adaptive_mutation_for_es branch from 6f95342 to bbfa81e Compare September 11, 2024 21:19