{{ $ctrl.sampleSizeHelp }}
Using a fixed random seed allows for reproducible results.
Number of cores used for parallel training. Using more cores leads to faster training but at the expense of more memory consumption, especially for large training datasets.
Computing with new sample settings will erase the previous computations.