Rethinking Dependencies in the AI Era
There's a question that every developer and system architect should be asking themselves in 2026: In the age of AI code generation, do we still need all those npm packages we've grown dependent on?
The answer isn't a simple yes or no. But I believe AI is fundamentally shifting the balance in how we should think about external dependencies, particularly for enterprise applications that need to remain stable and maintainable for years.
The Dependency Dilemma
Let's be clear: some external libraries are absolutely essential and will remain so. Frameworks like React, Vue, or Angular represent massive ecosystems with thousands of person-hours invested in testing, optimization, and edge cases. You don't want to rebuild React from scratch, and you shouldn't.
But then there's the other category: smaller libraries for text formatting, data serialization, validation, date manipulation, form handling, and hundreds of other "utility" functions. This is where things get interesting.
Every time we add an external dependency to our project, we're making a bet on:
- Continued maintenance - Will the maintainers still care in two years?
- Stability - Will they decide to rewrite everything with breaking changes?
- Compatibility - Will it work with the next version of Node, React, or TypeScript?
- Security - Will vulnerabilities be patched promptly?
- Bundle size - Are we importing megabytes for a simple function?
In enterprise development, especially in conservative corporate environments, we need stability above all else. We don't want to discover six months into development that a critical dependency is abandoned, or that version 2.0 requires a complete rewrite of our codebase.
The Traditional Trade-off
Historically, the calculation was simple: writing custom solutions was expensive. Why spend developer time and money building your own validation library when you could install one with a single command?
npm install some-validator
Done. Problem solved. Ship the feature.
This made perfect sense when developer time was the constraining resource. Building even moderately complex functionality from scratch required hours of coding, testing, and debugging. The risk of taking on an external dependency was worth it compared to the cost of building it yourself.
AI Changes the Equation
But here's what's different now: AI has drastically reduced the cost of writing custom code.
Today, you can describe what you need to an AI coding assistant, and within minutes have a working implementation that does exactly what you want. Need a date formatter with specific requirements for your business logic? AI can generate it. Need validation rules for your specific domain? AI can write it. Need a custom serialization format? Done.
The time and effort required to build small to medium-sized functionality has collapsed. What used to take hours or days now takes minutes.
"When the cost of building goes down, the value proposition of buying changes fundamentally."
The New Decision Framework
This shift means we need a new framework for deciding when to use external dependencies versus writing our own solutions. Here's how I think about it now:
Use External Libraries When:
- Complexity is extremely high - Full frameworks, database drivers, cryptography libraries
- Standards compliance matters - OAuth implementations, HTTP clients, protocol handlers
- The library is battle-tested - Millions of downloads, years of production use
- Active, professional maintenance - Corporate backing or established foundation support
- Ecosystem integration is critical - Core dependencies that other tools rely on
Write Custom Solutions When:
- Functionality is small to medium - Can be implemented in a few hundred lines
- Requirements are specific - You need exactly this behavior, not 90% of it
- Long-term stability matters - Enterprise apps that must run unchanged for years
- The library seems abandoned - Last update was years ago, issues pile up
- You're importing huge packages for tiny features - Using 1% of a library's functionality
- The dependency tree is concerning - One package pulls in 50 others
Real-World Example
Let me give you a concrete example from a recent project. We needed email validation for a registration form. The traditional approach would be to install a validation library:
npm install validator
import { isEmail } from 'validator'
But this package has dozens of functions we don't need, adds to our bundle size, and introduces a dependency we have to maintain and update. Is someone going to maintain this package in 5 years? Will it work with Node 25?
Instead, I asked an AI assistant to write a custom email validator with our specific requirements. Two minutes later, I had a well-tested, documented function that does exactly what we need, nothing more, nothing less. It's ours. We control it. It won't break in a major version update. It won't be abandoned.
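A minimal sketch of what such a validator might look like. The specific rules here (trimming, a length cap, a simple shape check) are illustrative assumptions, not the project's actual requirements:

```typescript
// Illustrative custom email validator. The rules are assumptions for the
// sketch -- real business requirements (allowed domains, stricter local-part
// rules, etc.) would be encoded here instead.
const EMAIL_PATTERN = /^[^\s@]+@[^\s@]+\.[^\s@]{2,}$/;

export function isValidEmail(input: string): boolean {
  const email = input.trim();
  // RFC 5321 caps the total address length at 254 characters.
  if (email.length === 0 || email.length > 254) return false;
  // One local part, one @, a domain with at least one dot and a 2+ char TLD.
  return EMAIL_PATTERN.test(email);
}
```

Because the function is ours, tightening or loosening any of these rules is a one-line edit rather than a package upgrade.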
The Ownership Advantage
There's another benefit to AI-generated custom code that's often overlooked: complete ownership and understanding.
When you use an external package, you're trusting code you haven't read and probably don't fully understand. When something breaks, you're dependent on others to fix it. When you need a slight modification, you either fork it (creating maintenance burden) or work around it (creating technical debt).
With AI-generated custom code, you get:
- Full visibility - You can read and understand every line
- Easy modification - Need to change behavior? Just edit it
- No version conflicts - It's not coupled to external update cycles
- Guaranteed stability - It only changes when you decide to change it
- Learning opportunity - Your team understands how it works
The Stability Imperative
In conservative enterprise development, stability is paramount. When you're building applications that need to run reliably for 5, 10, or 15 years, every external dependency is a potential time bomb.
I've seen projects grind to a halt because a critical dependency was abandoned. I've watched teams spend weeks upgrading packages because of breaking changes they never asked for. I've debugged subtle bugs introduced by "minor" version updates.
With AI, we now have a realistic alternative: write the code ourselves. Not because we're against open source or collaboration, but because for certain types of functionality, owning the code provides better long-term outcomes.
Practical Guidelines
If you're adopting this approach, here are some practical guidelines:
- Be selective - Don't swing to the opposite extreme and avoid all dependencies. Focus on the small to medium utilities.
- Document AI-generated code - Add comments explaining what it does and why you built it custom.
- Write tests - AI can generate these too. Don't skip testing just because it was easy to write.
- Review carefully - AI makes mistakes. Have experienced developers review before merging.
- Keep it simple - If the AI-generated solution is too complex, that's a signal you might need a library.
- Maintain a library of custom utilities - Build up your own trusted collection of functions.
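Putting the last two guidelines together, an in-house utility can ship with its own tests in the same module. The function below is a hypothetical example (a URL slug helper), not taken from the article's project:

```typescript
// Hypothetical entry in an internal utilities library: a URL slug helper.
// Behavior is an illustrative assumption; tests live next to the code so
// reviewers can see exactly what the function promises.
export function slugify(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFD")                 // split accented chars into base + combining mark
    .replace(/[\u0300-\u036f]/g, "")  // drop the combining diacritic marks
    .replace(/[^a-z0-9]+/g, "-")      // collapse runs of non-alphanumerics to hyphens
    .replace(/^-+|-+$/g, "");         // trim leading/trailing hyphens
}
```

A handful of such functions, reviewed once and tested, replaces a handful of dependencies you would otherwise have to track forever.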
A Cultural Shift
This represents a cultural shift in how we build software. For years, the JavaScript ecosystem encouraged "install everything" - there were famous examples, like left-pad, of packages with just a few lines of code getting millions of downloads.
But that culture emerged in a pre-AI world where writing code was expensive. We're now in a different world, and our practices need to evolve.
I'm not advocating for NIH (Not Invented Here) syndrome, where teams arrogantly refuse to use anything external. I'm advocating for a more nuanced, context-aware decision process that weighs AI-enabled custom development as a legitimate option, especially for enterprise applications prioritizing long-term stability.
Looking Forward
I believe this trend will accelerate. As AI coding tools become more capable and more developers gain experience using them effectively, we'll see:
- Fewer micro-packages - Less need for single-purpose utility libraries
- More internal libraries - Companies building collections of custom utilities
- Higher bar for dependencies - External packages will need stronger justification
- Consolidation around major frameworks - The truly essential libraries will remain dominant
- Better code ownership - Teams understanding their codebases more deeply
This doesn't mean external dependencies will disappear. It means the balance is shifting. The scales that once heavily favored "npm install" are becoming more even, and for many use cases, tipping toward "let's build it ourselves."
Conclusion
Every system architect and senior developer should be rethinking their dependency strategy in light of AI capabilities. The old rules about when to build versus buy are changing.
For small to medium functionality, AI has made custom development fast, cheap, and practical. When you combine this with the stability benefits of owning your code, the equation shifts dramatically, especially for long-lived enterprise applications.
The question isn't "do we still need external libraries?" It's "which external libraries do we actually need?" And increasingly, the answer is: fewer than we think.
AI doesn't just make us write code faster. It changes what code we should write in the first place.