My Favorite Papers

A home for my favorite papers.

Author: Nishant Aswani
Published: January 1, 2000

ML Fundamentals

Some Fundamental Aspects about Lipschitz Continuity of Neural Networks

Source: https://arxiv.org/pdf/2104.13478

  • Claims to study whether common networks have inherently Lipschitz behavior, rather than networks that are explicitly trained to exhibit it.
  • Studies how the Lipschitz constant changes from initialization (see the sketch below).
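
To make the Lipschitz framing concrete, below is a minimal sketch (my own illustration, not the paper's method) of the two quantities typically tracked for a feedforward ReLU network: the spectral-norm product upper bound on the Lipschitz constant, and an empirical lower bound from finite differences. The layer sizes and random weights are placeholder assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer ReLU MLP with random (untrained) weights; the sizes are
# placeholders, not the architectures studied in the paper.
layer_sizes = [16, 64, 64, 1]
weights = [rng.standard_normal((m, n)) / np.sqrt(n)
           for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    # Plain forward pass (no biases, for simplicity).
    for W in weights[:-1]:
        x = np.maximum(W @ x, 0.0)
    return weights[-1] @ x

# Upper bound: product of per-layer spectral norms, since ReLU is 1-Lipschitz.
upper_bound = np.prod([np.linalg.norm(W, 2) for W in weights])

# Empirical lower bound: largest observed ratio ||f(x) - f(y)|| / ||x - y||
# over random pairs of nearby inputs.
lower_bound = 0.0
for _ in range(1000):
    x = rng.standard_normal(layer_sizes[0])
    y = x + 1e-3 * rng.standard_normal(layer_sizes[0])
    ratio = np.linalg.norm(forward(x) - forward(y)) / np.linalg.norm(x - y)
    lower_bound = max(lower_bound, ratio)

print(f"spectral-norm upper bound: {upper_bound:.2f}")
print(f"empirical lower bound:     {lower_bound:.2f}")

Comparing these two numbers at initialization and again after training is roughly the kind of measurement the second bullet refers to.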

Bias/Variance is not the same as Approximation/Estimation

Source: https://openreview.net/forum?id=4TnFbv16hK

  • The bias/variance decomposition is not the same as the approximation error/estimation error decomposition (see the two identities below).
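
For context on why the two differ, here is a standard textbook statement of each (the notation is mine, not necessarily the paper's): the bias/variance decomposition averages squared error over training sets S for a fixed learning algorithm, while the approximation/estimation decomposition splits the excess risk of a learned predictor relative to its hypothesis class H.

% Bias/variance: expected squared error at a fixed input x, averaging over
% training sets S and label noise, with \bar{y}(x) = E[y | x]:
\mathbb{E}_{S,\,y}\big[(f_S(x) - y)^2\big]
  = \underbrace{\big(\mathbb{E}_S[f_S(x)] - \bar{y}(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\operatorname{Var}_S\!\big(f_S(x)\big)}_{\text{variance}}
  + \underbrace{\mathbb{E}\big[(y - \bar{y}(x))^2\big]}_{\text{noise}}

% Approximation/estimation: excess population risk of the learned predictor f_S
% over the Bayes risk R^*, split at the best predictor in the class \mathcal{H}:
R(f_S) - R^*
  = \underbrace{\Big(R(f_S) - \inf_{f \in \mathcal{H}} R(f)\Big)}_{\text{estimation error}}
  + \underbrace{\Big(\inf_{f \in \mathcal{H}} R(f) - R^*\Big)}_{\text{approximation error}}

The first identity averages over training sets for a fixed algorithm and makes no reference to a hypothesis class; the second compares a single learned predictor against the best predictor in its class, which is why the two notions need not coincide.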

Citation

BibTeX citation:
@online{aswani2000,
  author = {Aswani, Nishant},
  title = {My {Favorite} {Papers}},
  date = {2000-01-01},
  url = {https://nishantaswani.com/articles/papers.html},
  langid = {en}
}
For attribution, please cite this work as:
Aswani, Nishant. 2000. “My Favorite Papers.” January 1, 2000. https://nishantaswani.com/articles/papers.html.