Hyper-parameter tuning is a crucial step in improving the performance of neural networks, particularly for image classification. It systematically searches for hyper-parameters, such as the learning rate, optimizer, and batch size, that yield the best validation results. Common search strategies include grid search, random search, coarse-to-fine search, and Bayesian optimization. This tutorial provides a demo suite that shows how to find a good optimizer and learning rate using the tuner function in TensorFlow.

Good practices for hyper-parameter search include sampling hyper-parameters in scale (log) space rather than uniformly, preferring random search over grid search, narrowing the search from coarse to fine, and considering Bayesian optimization when individual trials are expensive. The demo provides a YAML configuration file that defines the number of trials, the fixed parameters, and the hyper-parameters to be searched, allowing users to tune the optimal hyper-parameters for ResNet32 on CIFAR10.
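A configuration of the kind described, with a trial count, fixed parameters, and searched hyper-parameters, might look like the following YAML sketch. The key names and values are hypothetical assumptions for illustration, not the demo's actual schema.

```yaml
# Hypothetical sketch of such a configuration; key names are
# illustrative assumptions, not the demo's actual schema.
num_trials: 20          # number of hyper-parameter combinations to try

fixed:                  # parameters held constant across all trials
  model: resnet32
  dataset: cifar10
  batch_size: 128
  epochs: 50

search:                 # hyper-parameters to tune
  optimizer:
    type: choice
    values: [sgd, adam, rmsprop]
  learning_rate:
    type: log_uniform   # sample in log (scale) space
    min: 1.0e-4
    max: 1.0e-1
```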
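To make the recommended practices concrete, the sketch below implements random search with the learning rate sampled in log space, so that each order of magnitude (e.g. 1e-4 vs. 1e-2) is equally likely to be drawn. This is a minimal, framework-free illustration; the function names (`sample_hyperparams`, `random_search`) and the search ranges are illustrative assumptions, not part of the demo suite.

```python
import math
import random


def sample_hyperparams(rng):
    """Randomly sample one trial's hyper-parameters.

    The learning rate is drawn in log10 space (scale space), a common
    practice so that small and large magnitudes are covered evenly.
    Ranges here are illustrative assumptions.
    """
    log_lr = rng.uniform(math.log10(1e-4), math.log10(1e-1))
    return {
        "optimizer": rng.choice(["sgd", "adam", "rmsprop"]),
        "learning_rate": 10 ** log_lr,
    }


def random_search(objective, n_trials=20, seed=0):
    """Run `n_trials` random trials and return (best_score, best_params)."""
    rng = random.Random(seed)
    trials = [sample_hyperparams(rng) for _ in range(n_trials)]
    scored = [(objective(params), params) for params in trials]
    return max(scored, key=lambda t: t[0])
```

In a real run, `objective` would train the model with the sampled hyper-parameters and return a validation metric; random search tends to outperform grid search here because it explores many distinct values of each individual hyper-parameter instead of a small fixed set.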