ML Jobs, Resources, and Content for Software Engineers #15: Stay Married to the Problem, Not the Solution
An AI reading list curated to make you a better engineer: 7-8-25
Welcome to Machine Learning for Software Engineers! Each week I curate insights and resources specifically for software engineers navigating our rapidly evolving AI landscape.
What does that mean? It means this weekly newsletter contains everything software engineers should know about AI:
An overarching lesson about engineering excellence from the past week
Five carefully selected resources to accelerate your technical growth
Interesting things that happened last week
Current job market insights and promising opportunities
Subscribe to get these in your inbox each week.
If you find Machine Learning for Software Engineers helpful, consider becoming a paid subscriber. For a limited time, you can lock in a price of just $3/month forever.
The most important lesson for becoming an excellent engineer
I remember a lunch conversation with some coworkers at Google over a year ago where we debated whether teams should be organized around the problems they're solving or the tech stacks they're working on.
The argument for problem-centered teams was that it makes throwing away solutions much easier when they don't work. Instead of maintaining a specific tech stack, teams focus on solving problems. The argument for teams centered around tech stacks was operational simplicity—easier on-call, code maintenance, and onboarding.
Development velocity is super important—especially in the age of AI—and a team's ability to recognize a solution that isn't working, throw out what they've already built, and pivot their strategy is fundamental to succeeding quickly.
I'm convinced centering teams around tech stacks is what makes development velocity so slow at large tech companies. Part of this is because those companies serve so many people that pivoting to a new solution is difficult, but part of this is also that the size of the company requires operational efficiency, which takes priority over development velocity.
This is frequently why, when a new technology emerges, we see a new player or startup lead the field with it instead of an incumbent.
The flipside is that problem-centered teams can become operationally disorganized. Examples include messy on-call where engineers cover unfamiliar tech stacks, teams solving problems from conflicting directions, or fragmented approaches that prevent collaboration.
Even seemingly simple organizational issues like these can lead to deficiencies in the user experience, which can be harmful over the long term.
I think the most important thing to understand is that, as a software engineer, your focus should not be on the solution or the tech stack but on the problem. You should be able to drive software development from the problem statement—this is the best way to understand how to build the best software for the user.
This is especially important for machine learning systems, which grow complicated as they solve complex problems. Solutions can very easily get away from the engineers and become too abstracted from the problems they're trying to solve.
Marc Randolph of Netflix put this perfectly in a recent tweet: Netflix started mailing DVDs, not streaming movies, but they stayed married to the problem they were working on, not the solution. This allowed them to pivot and become the streaming company we know today.
What do you think: should teams be organized around tech stacks for operational ease, or around the problems they're solving?
Resources to help you become a better engineer
Google Transforms Education with AI-Powered Teaching Tools Google has launched comprehensive AI features for Google Workspace for Education, allowing teachers to assign AI tutors to students and enabling automatic quiz generation with visual explanations. An improved educational system is better for society, and this is one of the areas where AI can make the biggest difference.
Understanding Reward Models: The Foundation of AI Alignment Cameron details Reinforcement Learning from Human Feedback (RLHF) and its impact on LLMs. His articles are always in-depth, well explained, and very accessible.
The American DeepSeek Project Open source AI (truly open source—not just open weights) is important to ensure AI is democratized and not centrally confined within a few superpowers that can afford the compute and researchers to develop top-of-the-line models. The author describes his vision for the near future of open source AI in America.
Small Language Models: The Future of Agentic AI Small Language Models are the future of agentic AI. This paper shows that SLMs are capable of running on edge devices and are practically as capable as their LLM counterparts when performing agentic tasks.
Software Engineering with LLMs in 2025: Reality Check While CEOs claim AI is reshaping companies and increasing output tenfold, engineers actually working with it are reporting the opposite. As someone working with AI, I can say it's very important to understand what it's good at and what it's not. Inaccurate representations hurt the technology long-term.
Other interesting things
The Carpentry Analogy: Why Programming Careers Aren't Disappearing Simon Willison's perspective on why quitting programming due to LLMs is like abandoning carpentry because of the table saw. Transformative tools enhance rather than replace skilled practitioners.
Tesla vs. Waymo: Different Approaches to Autonomous Systems Tesla's end-to-end vision system requires intervention every 400 miles, while Waymo's modular expert system can drive 20,000 miles between interventions. This is possibly the best comparison so far of modular agentic AI versus a one-agent-does-it-all approach.
The Rise of "Plastic Software" As AI code generation becomes more common, this idea makes us think about what we're trading off when we prioritize speed over long-term maintainability. I love the phrase “Plastic Software”.
Vibe Coding Reality Check A reminder that anyone claiming "vibe coding" will entirely replace software engineers hasn't dealt with the realities of production systems.
K-Scale Labs Launches Open-Source Humanoid Robot K-Bot represents the first open-source, affordable humanoid robot made in America. Robotics advancements are a must for physical intelligence to become real, and, in my opinion, the societal benefits of AI depend on it.
The Coming Research Crisis By 2026, the U.S. plans to cut a quarter of a million people from science research and education. Let this be a note on how policy decisions shape the engineering talent pipeline in ways that aren't immediately visible.
Job market and opportunities
Interestingly, two of the top job opportunities come from Twitter this week. I love unconventional hiring because I think the regular hiring process is broken. The rest of the jobs come from LinkedIn and vary in role, skillset, and location.
Growth Engineer at X (Twitter)
Nikita Bier is looking for a talented growth engineer to work directly on analyzing product experiments and monitoring X's userbase health.
Deep experience writing queries on large data sets
Expertise in eliminating noise from bots
Direct collaboration with product leadership
Part-Time Engineering Role for New Moms
A refreshing approach to work-life balance offering flexibility without compromising on compensation or meaningful work.
10-25 hours per week, $100-150/hour plus equity
No mandatory meetings, work from anywhere
Tech stack: Ruby/Rails, TypeScript/Next, React
Optional NYC office visits once weekly
Machine Learning Manager, Ads Targeting - Reddit
Define and drive engineering strategy for Ads Targeting products by developing machine learning–powered features that identify and target qualified users and content.
Experience managing and growing ML engineering teams
Proven track record building large-scale, end-to-end machine learning products
Technical expertise in Machine Learning, NLP, Ranking, or Recommender Systems
Management experience in advertising domain or ad tech platforms
Deep Learning Engineer - GenBio AI (Palo Alto, CA)
Design, develop, optimize, and maintain software systems and scalable codebases for the entire lifecycle of large-scale foundation models, from data pipelines to serving and inference.
Strong Python skills and proficiency with PyTorch, HuggingFace Transformers
Expertise in distributed systems, cloud computing (AWS, GCP), containerization
Experience pre-training or serving large language models or foundation models
Proficiency in back-end frameworks and database technologies
AI Software Developer - Morgan Stanley (Alpharetta, GA)
Build and re-architect high-quality technology solutions for cybersecurity with focus on high availability, resiliency, and scalability.
Experience requirements: Associate 3+ years, Director 6+ years, VP 10+ years
Excellent Java or Python skills
Hands-on experience with major Cloud Service Providers (AWS, Azure, GCP)
Experience with Infrastructure-as-code tools such as Terraform
Machine Learning Engineer - Baselayer (San Francisco, CA)
Develop, integrate, and maintain machine learning models and systems for fraud prevention and compliance, with focus on KYC/KYB processes.
1-3 years of experience in machine learning development
Strong foundation in AI/ML fundamentals, particularly with LLMs
Experience with large-scale data and advanced ML techniques (RLHF, LoRA)
Knowledge of fairness, explainability, and compliance in regulated environments
Research Scientist - Quantum Computing (New York, NY)
Identify feasible quantum algorithms and application areas for early fault tolerant quantum devices, benchmark resource requirements, and develop compiler optimizations.
Doctorate in Engineering, Physics, or Computer Science
Experience with quantum algorithm design, quantum compilers, or computer architecture
Proficiency in Python and Rust or C++
Junior AI/ML Engineer - Lensa (Sacramento, CA / Trenton, NJ)
Assist in designing, developing, and deploying AI-driven solutions using Azure cloud services for Federal Health Client.
Six months to one year experience with GPT models or LLMs
Six months to one year experience with Python or JavaScript
Knowledge of AI/ML concepts including neural networks, NLP, or model training
Experience with Microsoft Azure (OpenAI, Functions, Cosmos DB)
Thanks for reading!
Always be (machine) learning,
Logan