wojcech@alien.top to Machine Learning@academy.garden · English · 2 years ago
[R] "It's not just memorizing the training data" they said: Scalable Extraction of Training Data from (Production) Language Models (arxiv.org)
seraphius@alien.top · 2 years ago
On most tasks, memorization would be overfitting, but I think "overfitting" is task- and generalization-dependent. As long as the model makes accurate predictions on new data, it doesn't matter that it can cough up the old.
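The comment's point can be illustrated with a minimal sketch (my own toy example, not from the paper): a 1-nearest-neighbour classifier memorizes its training set verbatim, so it can always "cough up the old" with 100% training accuracy, yet it still generalizes fine on new data when the task is easy enough.

```python
import random

random.seed(0)

def make_data(n):
    """Synthetic 2-D points; the true label is simply whether x > 0."""
    pts = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
    return [(p, 1 if p[0] > 0 else 0) for p in pts]

train = make_data(200)
test = make_data(100)

def predict(x):
    # 1-NN: return the label of the nearest memorized training point.
    nearest = min(train, key=lambda t: (t[0][0] - x[0]) ** 2 + (t[0][1] - x[1]) ** 2)
    return nearest[1]

def accuracy(data):
    return sum(predict(x) == y for x, y in data) / len(data)

train_acc = accuracy(train)  # exactly 1.0: every training point is memorized verbatim
test_acc = accuracy(test)    # still high: memorization here does not hurt generalization
```

Perfect recall of the training set and good performance on held-out data coexist here; whether verbatim recall is a *problem* (as in the extraction attack the post links) is a separate question from whether the model overfits.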