I’m a data engineer who somehow ended up as a software developer. Many of my friends are now working with the OpenAI API to add generative capabilities to their products, but they lack a lot of context about how LLMs actually work.

This is why I started writing popular-science-style articles that unpack AI concepts for software developers working on real-world applications. It started slowly; honestly, my early pieces were a bit too “brainy” for that crowd. But I’ve since found a voice that resonates much better with this audience, and I want to ramp up my writing cadence.

I’d love to hear your thoughts on what concepts I should write about next.
What gets you excited, and what do you find hard to explain to someone with a different background?

  • PhilsburyDoboy@alien.top · 1 year ago

    I’m particularly excited about AI accelerating theorem provers and optimization problems (think: traveling salesman). These problems are NP-hard and scale very poorly, and we’d see huge efficiency gains across most industries if they scaled better. Recently there has been some very exciting research on using neural networks to accelerate and scale mixed-integer linear programming (MILP) and linear programming (LP) solvers.

    For reference, optimization problems include:

    • SpaceX rocket landing
    • Car navigation systems
    • Electric grid operations/markets
    • Portfolio optimization
    • Stock and options trading
    • Airline fleet operations
    • Ship/Truck logistics
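
    To make the “scales very poorly” point concrete, here’s a toy brute-force traveling-salesman solver (not the neural-accelerated approach from the research above — just an illustration). The city coordinates are made up; the key thing to notice is that checking every tour grows factorially with the number of cities, which is exactly why better-scaling solvers matter:

    ```python
    # Brute-force TSP: enumerate every possible tour and keep the shortest.
    # With n cities this checks (n-1)! tours, so it becomes hopeless fast.
    from itertools import permutations
    import math

    # Arbitrary example coordinates (x, y) for five cities.
    cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 0)]

    def tour_length(order):
        # Total distance visiting cities in `order`, returning to the start.
        return sum(
            math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
            for i in range(len(order))
        )

    # Fix city 0 as the starting point so rotations of the same loop
    # aren't counted as distinct tours.
    best = min(
        ((0,) + rest for rest in permutations(range(1, len(cities)))),
        key=tour_length,
    )
    print(best, round(tour_length(best), 2))
    ```

    Five cities means only 24 tours to check, but 20 cities already means ~1.2×10^17 tours — which is why real systems reach for MILP formulations and heuristics instead of enumeration.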