A model with high [[error]] due to [[bias]] can fail to capture the regularities in the [[data]], resulting in an inaccurate model that underfits the data. Increasing the complexity of the model, for example by adding more [[parameter]]s, can reduce this [[bias]]. However, an excessively complex model, such as one with too many parameters relative to the number of [[patient]]s, can describe [[random error]] or noise instead of the meaningful relationships; this is referred to as overfitting the data. Overfitting results in increased error due to variance and reduced generalizability to previously unseen data. The complexity of a model should therefore be a tradeoff between bias and [[variance]] ((Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349(6245):255-260.)).
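
The tradeoff can be illustrated with a small synthetic experiment (a sketch of the general idea, not taken from the cited source): fit polynomials of increasing degree to noisy observations of a known curve and compare the error on the training data with the error on held-out data. The data, the sine-curve signal, and the chosen degrees are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: true signal is a sine curve plus noise.
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

# Hold out every other point to expose overfitting.
train = np.arange(x.size) % 2 == 0
x_tr, y_tr = x[train], y[train]
x_te, y_te = x[~train], y[~train]

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((a - b) ** 2))

# Model complexity = polynomial degree (degree + 1 parameters).
for degree in (1, 3, 15):
    coefs = np.polyfit(x_tr, y_tr, degree)
    err_tr = mse(np.polyval(coefs, x_tr), y_tr)
    err_te = mse(np.polyval(coefs, x_te), y_te)
    print(f"degree {degree:2d}: train MSE {err_tr:.3f}, test MSE {err_te:.3f}")
```

A degree-1 fit underfits (high error on both sets: bias), a moderate degree tracks the signal, and a degree close to the number of training points drives the training error toward zero while the held-out error grows (variance).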