With the advent of LLMs, multimodality, and “general purpose” AIs that sit on unimaginable amounts of money, computing power, and data, I’m graduating and want to start a PhD, but I feel quite disheartened by the huge results obtained simply by “brute-forcing,” and by the ever-growing hype in machine learning that could produce a bubble of data scientists, ML researchers, and so on.
When I did my PhD, starting around 20 years ago, things were different. Feature engineering within a data domain was a pretty common way to specialize. Neural networks were old-fashioned function approximators that went out with shoulder pads. The future was structured Bayes nets, or whatever was going to replace conditional random fields.
I’d listen to my PIs talk about how things were different in their day, how I was leaning too much on the power of models to just learn things, and how I had to focus more on the precise choice of model. When I pointed out that, given the right feature engineering, the models basically performed the same, they’d dismiss that, saying I was leaving a lot on the table by not deriving a more fit-to-purpose model.
These days, I look at the work of the junior modelers I supervise, and I urge them to at least look at the data, because you can’t really understand the problem you’re working on without getting your hands on the data. But they’d rather focus on aggregate performance metrics than examine their outliers. After all, with large datasets, you may not even be able to look at all your outliers to recognize patterns. And how could you do as good a job as one of today’s giant networks?
Then there are LLMs, where you may be handed something that can already basically solve your problem. That turns even more ways of working on their head.
But the fact is, these patterns repeat.
You’re going to study something that won’t be as relevant in 20 years. That’s the way of all rapidly moving fields. And it’s okay. All code ever written will one day be dust. Even if we don’t like to admit it, the same is true of every publication. Industry or academia, you build things that move us forward, slowly, in fits and starts. That’s the broader game you opt into in this field.
ML will change. So will everything else.