• 0 Posts
  • 1 Comment
Joined 1 year ago
Cake day: November 11th, 2023

  • The main trick is learning to filter out the bs “attention aware physics informed multimodal graph centric two-stage transformer attention LLM with clip-aware positional embeddings for text-to-image-to-audio-to-image-again finetuned representation learning for dog vs cat recognition and also blockchain” papers with no code.

    That still leaves you with quite a few good papers, so you need to narrow your focus to your specific research area. There’s no way you can keep up with all of ML.
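
    As a toy sketch of that filtering heuristic, here’s one way it could look in Python. The paper metadata fields (`title`, `code_url`, `keywords`) are hypothetical, purely for illustration:

    ```python
    def filter_papers(papers, my_area):
        """Keep papers that ship code and match a specific research area.

        Field names (title, code_url, keywords) are made up for this sketch.
        """
        kept = []
        for p in papers:
            if not p.get("code_url"):
                # No released code -> likely not reproducible, skip it
                continue
            if my_area not in p.get("keywords", []):
                # Outside your research area -> skip it
                continue
            kept.append(p)
        return kept

    papers = [
        {"title": "Two-stage transformer for dog vs cat (and blockchain)",
         "code_url": None, "keywords": ["classification"]},
        {"title": "Efficient attention for long sequences",
         "code_url": "https://example.org/repo", "keywords": ["attention"]},
    ]
    print([p["title"] for p in filter_papers(papers, "attention")])
    ```

    The real filter is of course a judgment call, not two `if` statements, but the two checks mirror the advice above: has code, and is in your lane.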