
Search near init params and some mini fixes #985

Merged (6 commits) on Nov 24, 2022
Conversation

@YamLyubov (Collaborator) commented Nov 17, 2022

PipelineTuner now takes a pipeline's initial parameters into account. To achieve that, the search is first run on a search space in which the initial parameters are fixed; after that, the search continues over the whole initial search space.
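A minimal sketch of this two-phase scheme using hyperopt directly (the objective, the single-parameter space, and the 10% warm-up split are illustrative assumptions, not the PR's exact code):

```python
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    # Illustrative objective; in FEDOT this would evaluate the pipeline.
    return (params['alpha'] - 0.3) ** 2

# Full search space vs. a degenerate space that pins 'alpha'
# to its initial value with probability 1.
full_space = {'alpha': hp.uniform('alpha', 0.0, 1.0)}
init_space = {'alpha': hp.pchoice('alpha', [(1.0, 0.5)])}

iterations = 50
init_trials_num = min(iterations // 10, 10)

trials = Trials()
# Phase 1: evaluate the fixed initial parameters first,
# so they enter the shared trials history.
fmin(objective, init_space, algo=tpe.suggest, trials=trials,
     max_evals=init_trials_num, show_progressbar=False)
# Phase 2: continue on the whole space with the same Trials object;
# max_evals counts the total number of evaluations, warm-up included.
best = fmin(objective, full_space, algo=tpe.suggest, trials=trials,
            max_evals=iterations, show_progressbar=False)
```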

Comment on lines 48 to 49
pipeline_copy = deepcopy(pipeline)
tuned_pipeline = tuner.tune(pipeline_copy)
Collaborator: Why is this important here? The pipeline could just as well be modified in-place.

Comment on lines 62 to 63
init_trials_num = min(int(self.iterations * 0.1), 10) \
if (self.iterations >= 10 and not is_init_parameters_full) else 1
Collaborator: For better readability, it would be better to split this across several lines.
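For instance, a behavior-preserving multi-line rewrite of the expression above might look like this (a sketch, not the PR's final wording):

```python
if self.iterations >= 10 and not is_init_parameters_full:
    # Spend up to 10% of the budget (capped at 10 trials) near the initial point.
    init_trials_num = min(int(self.iterations * 0.1), 10)
else:
    init_trials_num = 1
```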

Comment on lines +65 to +79
fmin(partial(self._objective, pipeline=pipeline),
initial_parameters,
trials=trials,
algo=self.algo,
max_evals=init_trials_num,
show_progressbar=show_progress,
early_stop_fn=self.early_stop_fn,
timeout=self.max_seconds)
Collaborator: Is it intentional that the result of fmin is not used here? If so, it is worth explaining.
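For context: hyperopt's fmin returns the best point it found, but it also records every evaluation in the Trials object passed to it, so the return value can legitimately be ignored when only the accumulated history is needed. A minimal illustration with a toy objective:

```python
from hyperopt import fmin, tpe, hp, Trials

trials = Trials()
best = fmin(lambda x: x ** 2, hp.uniform('x', -1.0, 1.0),
            algo=tpe.suggest, max_evals=5, trials=trials,
            show_progressbar=False)

print(best)                                 # best parameters, e.g. {'x': ...}
print(len(trials.trials))                   # all 5 evaluations are recorded
print(trials.best_trial['result']['loss'])  # best loss is also in the history
```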

Comment on lines +29 to +35
try_initial_parameters = init_parameters and self.iterations > 1

if try_initial_parameters:
trials, init_trials_num = self._search_near_initial_parameters(pipeline, init_parameters,
is_init_params_full, trials,
Collaborator: This needs some comment explaining what is going on; otherwise it will be hard to remember later why this exists.
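One way to address this is a comment right above the block; a possible draft, based on the intent stated in the PR description (an assumption, not the author's wording):

```python
# Warm-up phase: if initial parameters are given and there is more than
# one iteration, first run a short search with those parameters fixed,
# so the initial point is evaluated and stored in `trials` before the
# search widens to the full space.
```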

for key in parameters_dict:
if key in initial_parameters:
value = initial_parameters[key]
init_params_space[key] = hp.pchoice(key, [(1, value)])
Collaborator: The use of hp.pchoice here is not very clear.
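For reference, hp.pchoice(label, options) samples from (probability, value) pairs, so a single pair with probability 1 degenerates to a constant: the parameter is effectively pinned to its initial value. A minimal check (label and value are illustrative):

```python
from hyperopt import hp
from hyperopt.pyll.stochastic import sample

# A one-option categorical with probability 1 always yields the same value,
# which is how an initial parameter gets "fixed" in the warm-up space.
pinned = hp.pchoice('n_estimators', [(1.0, 100)])
print(sample(pinned))  # always 100
```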

Comment on lines 96 to 97
tunable_initial_params = {f'{node_id} || {operation_name} | {p}':
node.parameters[p] for p in node.parameters if p in tunable_node_params}
Collaborator: Maybe move the generation of the delimiter-separated string into a function with a descriptive name?
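A sketch of such a helper; the function name is illustrative, not taken from the PR:

```python
def get_parameter_label(node_id: str, operation_name: str, parameter_name: str) -> str:
    # Hypothetical helper: builds the composite search-space key
    # '<node_id> || <operation_name> | <parameter_name>' in one place.
    return f'{node_id} || {operation_name} | {parameter_name}'
```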

@nicl-nno (Collaborator) left a comment

Overall, it seems to work; once these fixes are in, it can be merged.

You could also add to the line

2022-11-22 15:57:16,587 - FEDOT logger - Final pipeline: {'depth': 2, 'length': 4, 'nodes': [rf, scaling, normalization, pca]}

the output of the hyperparameters, for clarity.

And at the start of the tuner's run, the initial approximation could be logged as well (also with its hyperparameters).
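A hypothetical sketch of what such output could look like, assuming only pipeline.nodes and node.parameters (the latter appears in the diff above); the exact FEDOT logging call is not shown:

```python
def describe_pipeline_with_params(pipeline) -> str:
    # Hypothetical helper: one entry per node with its current hyperparameters,
    # usable both for the initial approximation and for the final pipeline.
    return ', '.join(f'{node}: {node.parameters}' for node in pipeline.nodes)
```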
