What are you trying to imply? That training Transformer models must be a continuous process? You know it’s pretty easy to stop and resume training from a checkpoint, right?
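For context on the claim above: pausing and resuming training rests on checkpointing, i.e. periodically serializing the model weights, optimizer state, and step counter so a later process can pick up exactly where the last one stopped. Here is a minimal stdlib-only sketch of the idea; the function names and the toy "model" are illustrative (a real Transformer trainer would serialize `state_dict`s with something like `torch.save` instead of `pickle`):

```python
import io
import pickle

def save_checkpoint(step, weights, optimizer_state, buf):
    # Persist everything needed to resume: parameters, optimizer
    # state, and the global step counter.
    pickle.dump({"step": step, "weights": weights,
                 "opt": optimizer_state}, buf)

def load_checkpoint(buf):
    return pickle.load(buf)

def train(total_steps, state=None):
    # Resume from a checkpoint dict if given, else start fresh.
    step = state["step"] if state else 0
    weights = state["weights"] if state else [0.0]
    opt = state["opt"] if state else {"momentum": 0.0}
    while step < total_steps:
        weights[0] += 0.1          # stand-in for a gradient update
        opt["momentum"] = 0.9 * opt["momentum"] + 0.1
        step += 1
    return step, weights, opt

# Train 5 steps, "shut down", then resume to 10 steps: the result
# matches an uninterrupted 10-step run exactly.
buf = io.BytesIO()
step, w, opt = train(5)
save_checkpoint(step, w, opt, buf)
buf.seek(0)
resumed = train(10, load_checkpoint(buf))
full = train(10)
print(resumed == full)
```

Because the resumed run replays the identical update sequence from step 5 onward, the interrupted and uninterrupted runs end in the same state, which is the whole point of the "stop and continue" argument.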
I don’t know why people keep commenting in spaces they’ve never worked in.
No datacenter is shutting off a leg, hall, row, or rack because “We have enough data, guys.” Maybe at your university server room where CS majors are interning. These things run 24/7/365, with UU tracking specifically to keep them up.
What are you talking about? Who said anything close to “we have enough data, guys”?
Are you ok? You came in with a very snippy and completely wrong comment, and you’re continuing with something completely random.