
Commit 396ff54

suggestions from review
1 parent b01e558 commit 396ff54

4 files changed: +32 -32 lines changed


autoPyTorch/pipeline/base_pipeline.py

Lines changed: 1 addition & 2 deletions
@@ -300,8 +300,7 @@ def _get_hyperparameter_search_space(self,
     def _add_forbidden_conditions(self, cs: ConfigurationSpace) -> ConfigurationSpace:
         """
         Add forbidden conditions to ensure valid configurations.
-        Currently, Learned Entity Embedding is only valid when encoder is one hot encoder
-        and CyclicLR is disabled when using stochastic weight averaging and snapshot
+        Currently, CyclicLR is disabled when using stochastic weight averaging and snapshot
         ensembling.
 
         Args:
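The restriction the docstring keeps (CyclicLR is forbidden together with stochastic weight averaging and snapshot ensembling) is expressed through ConfigSpace forbidden clauses. As a rough, hypothetical sketch only — the hyperparameter names and choices below are made up for the example and are not the exact ones autoPyTorch registers:

from ConfigSpace.configuration_space import ConfigurationSpace
from ConfigSpace.forbidden import ForbiddenAndConjunction, ForbiddenEqualsClause
from ConfigSpace.hyperparameters import CategoricalHyperparameter

cs = ConfigurationSpace()
# Illustrative hyperparameters; the real ones in autoPyTorch are nested per component.
scheduler = CategoricalHyperparameter("lr_scheduler", ["CyclicLR", "CosineAnnealingLR"])
use_swa = CategoricalHyperparameter("use_stochastic_weight_averaging", [True, False])
cs.add_hyperparameters([scheduler, use_swa])

# Forbid sampling CyclicLR together with stochastic weight averaging.
cs.add_forbidden_clause(
    ForbiddenAndConjunction(
        ForbiddenEqualsClause(scheduler, "CyclicLR"),
        ForbiddenEqualsClause(use_swa, True),
    )
)

Sampling from this space with cs.sample_configuration() then never returns the forbidden combination.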

autoPyTorch/pipeline/components/setup/network_embedding/LearnedEntityEmbedding.py

Lines changed: 1 addition & 2 deletions
@@ -23,8 +23,7 @@ def __init__(self, config: Dict[str, Any], num_categories_per_col: np.ndarray, n
         """
         Args:
             config (Dict[str, Any]): The configuration sampled by the hyperparameter optimizer
-            num_input_features (np.ndarray): column wise information of number of output columns after transformation
-                for each categorical column and 0 for numerical columns
+            num_categories_per_col (np.ndarray): number of categories per categorical column that will be embedded
             num_features_excl_embed (int): number of features in X excluding the features that need to be embedded
         """
         super().__init__()
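For context on the renamed argument, here is a minimal sketch of how the two numeric arguments could be derived from a toy dataframe. The column names and the assumption that every object-typed column gets embedded are made up for illustration and do not reflect autoPyTorch's actual column-splitting logic.

import numpy as np
import pandas as pd

X = pd.DataFrame({
    "city": ["paris", "tokyo", "paris", "lima"],  # categorical, to be embedded
    "color": ["red", "blue", "red", "red"],       # categorical, to be embedded
    "age": [31, 47, 22, 55],                      # numerical, kept as-is
})

categorical_cols = X.select_dtypes(include="object").columns
# Number of categories per categorical column that will be embedded.
num_categories_per_col = np.array([X[col].nunique() for col in categorical_cols])
# Features in X excluding the ones that need to be embedded.
num_features_excl_embed = X.shape[1] - len(categorical_cols)

print(num_categories_per_col)   # [3 2]
print(num_features_excl_embed)  # 1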

autoPyTorch/pipeline/tabular_classification.py

Lines changed: 15 additions & 14 deletions
@@ -51,20 +51,21 @@ class TabularClassificationPipeline(ClassifierMixin, BasePipeline):
     It implements a pipeline, which includes the following as steps:
 
     1. `imputer`
-    2. `encoder`
-    3. `scaler`
-    4. `feature_preprocessor`
-    5. `tabular_transformer`
-    6. `preprocessing`
-    7. `network_embedding`
-    8. `network_backbone`
-    9. `network_head`
-    10. `network`
-    11. `network_init`
-    12. `optimizer`
-    13. `lr_scheduler`
-    14. `data_loader`
-    15. `trainer`
+    2. `column_splitter`
+    3. `encoder`
+    4. `scaler`
+    5. `feature_preprocessor`
+    6. `tabular_transformer`
+    7. `preprocessing`
+    8. `network_embedding`
+    9. `network_backbone`
+    10. `network_head`
+    11. `network`
+    12. `network_init`
+    13. `optimizer`
+    14. `lr_scheduler`
+    15. `data_loader`
+    16. `trainer`
 
     Contrary to the sklearn API it is not possible to enumerate the
     possible parameters in the __init__ function because we only know the
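The updated step list can be checked against a constructed pipeline. A hedged sketch, assuming the pipeline can be built directly from a minimal dataset_properties dict (the exact required keys may differ between versions; normal usage goes through the TabularClassificationTask API instead):

from autoPyTorch.pipeline.tabular_classification import TabularClassificationPipeline

dataset_properties = {
    "task_type": "tabular_classification",
    "output_type": "binary",
    "numerical_columns": [0, 1],
    "categorical_columns": [2],
    "issparse": False,
}
pipeline = TabularClassificationPipeline(dataset_properties=dataset_properties)

# `steps` is a list of (name, component) pairs in the same order as the docstring,
# so `column_splitter` should appear right after `imputer`.
for name, _ in pipeline.steps:
    print(name)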

autoPyTorch/pipeline/tabular_regression.py

Lines changed: 15 additions & 14 deletions
@@ -53,20 +53,21 @@ class TabularRegressionPipeline(RegressorMixin, BasePipeline):
     It implements a pipeline, which includes the following as steps:
 
     1. `imputer`
-    2. `encoder`
-    3. `scaler`
-    4. `feature_preprocessor`
-    5. `tabular_transformer`
-    6. `preprocessing`
-    7. `network_embedding`
-    8. `network_backbone`
-    9. `network_head`
-    10. `network`
-    11. `network_init`
-    12. `optimizer`
-    13. `lr_scheduler`
-    14. `data_loader`
-    15. `trainer`
+    2. `column_splitter`
+    3. `encoder`
+    4. `scaler`
+    5. `feature_preprocessor`
+    6. `tabular_transformer`
+    7. `preprocessing`
+    8. `network_embedding`
+    9. `network_backbone`
+    10. `network_head`
+    11. `network`
+    12. `network_init`
+    13. `optimizer`
+    14. `lr_scheduler`
+    15. `data_loader`
+    16. `trainer`
 
     Contrary to the sklearn API it is not possible to enumerate the
     possible parameters in the __init__ function because we only know the
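The same 16-step ordering applies to the regression pipeline, and the forbidden conditions added in base_pipeline.py end up in its combined configuration space. A hedged sketch under the same assumptions as the classification example above (dataset_properties keys are illustrative, not a guaranteed minimal set):

from autoPyTorch.pipeline.tabular_regression import TabularRegressionPipeline

dataset_properties = {
    "task_type": "tabular_regression",
    "output_type": "continuous",
    "numerical_columns": [0, 1],
    "categorical_columns": [2],
    "issparse": False,
}
pipeline = TabularRegressionPipeline(dataset_properties=dataset_properties)

# The combined search space of all steps, including any forbidden clauses
# added by _add_forbidden_conditions.
print(pipeline.get_hyperparameter_search_space())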
