01 January 2026
It's that time of the year again, when I make predictions for the upcoming year. As has become my tradition now for almost two decades, I will first go back over last year's predictions, to see how well I called it (and keep myself honest), then wax prophetic on what I think the new year has to offer us.
In the large, 2025 was... well, damn. Last year at this time I wrote, "We thought 2023 was tumultuous, but then 2024 said, 'Hold my beer.'" and I find myself wanting to repeat the phrase all over again. I can't quite put my finger on what is happening, but finding myself wanting to repeat these phrases over and over again suggests that either there's a larger social element at work here (I solemnly swear I am NOT going to bring up politics), or else I've entered my "Grumpy Old Man" era. Or, possibly, both. Either way, 2025 was an "interesting" year, where "interesting" is that word you use when you really don't know what other word to use.
Just when the tech market felt like it was just maybe, sorta, starting to make sense, the whole "AI will replace developers!" thing kicked into high gear and threw us all for a loop. LinkedIn was bad in 2024? LinkedIn in 2025 said, "Hold my beer. And, you know, here's my car keys and cellphone too--this is going hyper cray-cray." Companies were hiring. Companies were laying off. Jobs were lost to AI. No, turns out they weren't lost to AI, but lost because companies were losing money on AI projects. No, they were making money on AI projects, just not where people could see it. No, that was a lie.
You know how that Ancient Chinese Curse says, "May you live in interesting times"? Welcome to the most interesting times of my lifetime, anyway. Now you know why the Ancient Chinese considered that a curse and not a blessing.
Let's take that look back in the mirror and see what I said a year ago and how close--or off--I was.
As per previous years, I'm giving myself either a +1 or a -1 based on purely subjective and highly-biased evaluation criteria as to whether it actually happened (or in some cases at least started to happen before 31 Dec 2025 ended). If you want to skip to the new predictions, scroll down to the next major heading.
... even more stuff than in 2024, it seems, but that's often because there's so many parallel threads that it's hard to stay focused. I'll try harder this time around. But let's start with some of the personal (and therefore easy to score) stuff.
Seriously! It will! I promise!
Result: As I write this (on December 31st, 2025), I can only say that we got so close. But we didn't get it slid in under the wire--we're still iterating on proofs with the publisher. That said, we do have a publisher page, and we can take pre-orders, and....
Yeah, OK, no, we didn't ship. -1. But, on the other hand....
Three years and counting. I wouldn't mind the "enforced retirement" so much, were it not for all these people coming around my place, telling me I owe them money.... Fortunately I keep finding the odd thing to do every so often, but it'd be nice to not have to answer questions that start with "Tell me of a time when....".
Result: +1. I picked up some consulting work at Capital One during the summer of 2025, and that in turn transformed into a full-time engagement at Capital One, so yay! Employment! Oh, and yay for a win in the predictions, but more, yay for employment!
The probability goes up if I don't have something full-time landed by the end of January.
Result: -1. I got distracted by a few things, and then the consulting thing popped up. Sorry not sorry and all that.
I have no idea which one, but I'd love for some suggestions and/or requests. I love the shows I hit regularly, but I wouldn't mind going someplace entirely new and talking to people who've never heard me speak before.
Result: -1. Nope. Didn't happen. Still want it to, though, so this'll carry over into 2026.
Look, let's be honest: Lots of people have a vested interest in seeing software development's "hold" over the creation of software broken. Startup founders desperately want to believe they don't need a CTO or any of those really expensive "full-stack" people to build out their idea. Non-FAANG tech-adjacent companies look at their labor costs and think, "Do we really need to have all these expensive people on our payroll?" AI venture capitalists (who want to see their investment make a fortune quickly, so they can bail and reap the high stock price) want people to buy into the idea that AI can replace... pretty much anybody... without waiting for actual evidence of the success of replacing humans with AI. It's all going to hit a fever pitch this year, as the "pro-AI" crowd tries to drown out the growing surge of "anti-AI" evidence, then the bubble will burst, the AI startups will start to collapse, and the VCs will start hunting up their next hype to invest-pump-dump.
Result: My goodness, I guess I got half of that right. The "developers are obsolete" prose definitely hit a high note in 2025. I mean, some people took real swings at it with articles like "The Junior Developer Extinction: We’re All Building the Next Programming Dark Age". And, in some ways, yeah, the correction/caveats came right with it, such as what we see in that very same article: "Recent studies tell contradictory stories. PwC’s 2025 Global AI Jobs Barometer found that productivity growth nearly quadrupled in AI-exposed industries since 2022, rising from 7% to 27%, while a major Danish study of 25,000 workers across 7,000 workspaces found that AI chatbots “had no significant impact on earnings or recorded hours in any occupation” — with users saving just 3% of their time on average. ... Meanwhile, Microsoft research involving over 4,000 developers found that those using GitHub Copilot achieved a 26% increase in productivity, and Nielsen Norman Group studies suggest productivity gains as high as 126% for coding tasks. But here’s the kicker: according to the 2024 DORA report, speed and stability have actually decreased due to AI, and Uplevel’s quantitative study found that using Copilot didn’t result in productivity improvements but did increase the rate of bugs produced."
Can someone please make it all make sense? Developers using AI are going slower, yet companies are reporting higher productivity metrics, yet companies are reporting failed AI projects in numbers we've (literally) never heard of for a technology that still remains so popular and strategy-centric. It's baffling! It's bonkers! It's cray-cray! And it's not like the people writing about it aren't trying to make sense of it all; again, from that same article: "Meanwhile, The New Stack’s 2025 developer survey found that 'developers have become even more cynical and frankly concerned about the return on investment of AI in software development.' The honeymoon period is ending, and reality is setting in."
Except... that the reality is having a hard time setting in. LinkedIn, that finger-on-the-pulse of professional movements, still feeds me quotes and citations and examples of companies who are "all in" on AI and AI coding tools and other "agentic" things. Tons of "AI influencers" (most of whom don't appear to have ever written code in their lives unless it was of the "vibe" variety) are breathlessly telling us of how the whole profession is changing. If you're getting the same kind of feed I am, it's obvious that there's still a TON of companies sinking an exorbitant amount of money into AI-related projects... even as a growing number of tech professionals (paraphrasing Fowler, Beck, and a few others) are telling us all to "calm down" and that "AI isn't replacing developers, but it can make developers more successful".
We'll come back to this in the 2026 predictions below, but I think that the dichotomous nature of what's happening here is explainable, and justifies me giving myself a +1 on this one.
It may be this year, or it may be next year, but the AI venture-capital-gravy-train is about to run out. Fully 25% of all VC money went into "something something AI", and it's eerily reminiscent of the "dot-com" years, when a huge percentage of all investment and VC money went into "something something Internet" and gave us such wonderful profit-avoidant monstrosities as "pets.com". (Seriously. The sock puppet. It's all that's left of what was one of the most trumpeted startups in the early 2000s.) But eventually, your business has to actually make money (even you, Sam Altman), and when the profits aren't there, the shutdowns follow.
Result: I guess it all depends on how you classify "failing". First off, yes, there were failures; in fact, more than a few failed. Builder.ai was the classic con-man game, calling it AI but secretly operating humans in the background, but it wasn't always that simple. Rain AI ("revolutionary AI chips") took that classic line of "let's put the software thing into a chip", and like its Lisp- and Java-flavored predecessors, collapsed. And, yes, lots of startups fail, but these were some pretty big ones with some pretty sizable players backing them in some pretty large ways.
It's important to call out that some of this was due to the environment, not the tech: When the VCs handed out money to (literally) everybody in 2020/2021, those companies had a 2-3 year runway, which, if you do the math (and add in an extra year of "emergency/bridge funding"), ends somewhere in the 2023-2025 timeframe. They had to make good on their business, or die trying.
But the complete collapse of a startup isn't the only way it fails; Google, for example, did a talent-acquisition play with Windsurf, which raises the question, "If Windsurf fails in 2026, is that because the market isn't there, or because the company didn't execute, or because the company's key executors were scooped out?"
The bloodbath isn't over yet, I don't think, but that's for the 2026 section. For now, I'm going to call this a +1 because we definitely saw some AI-centric startups that looked impregnable suddenly collapse.
In the beginning, there was the x86. And Intel rode that line into massive, generation-spanning profits. But slowly, over time, the ARM chip architecture has been steadily eroding the bulwark that x86/x64 erected ("You'd have to recompile everything! Nothing would run out of the box! Dogs and cats will merge to form a master race that will enslave humanity!"), such that by the end of 2024, not only was any shipping device without a built-in keyboard running ARM internally, Microsoft had a version of Windows running on the ARM chip. This upcoming year, there are literally no barriers left to ARM's complete dominance of the CPU market. The "Apple Silicon" MacBooks were just the vanguard of the horde of new ARM-based laptops that are going to come out and essentially take over. (As a caveat, if you're a native-language developer, as in you use C, C++, Swift, Rust, Go, Nim, Zig, and so on--if you've not started learning what ARM assembly looks like and how it behaves, now's a good time to start.)
Result: Windows. On. ARM. +1. That knocks out the final strut keeping the avalanche away from the ski lodge that is Intel's x86/x64 business. The numbers aren't there yet (if you can find any!), with some folks putting 2025 sales of Windows/ARM PCs at around 10-15% of new laptop sales, but the signs are there that this is going to heat up soon. Intel's going to do everything it can to try and staunch the bleeding, but they've now reached the official "This is the moment it all went to shit" demarcation point.
Look, I've been saying this for a few years now, and I'm going to make this the last year I do so, but these languages are so mainstream right now that nobody (except for conference speakers, who constantly mine the feature lists for "What's New" talks for the coming year) really (a) pays much attention to their "v.Next" featureset, or (b) cares. The price of mainstream status is a desire for "sustainability", which means most Java shops are not moving versions unless it's an "LTS" release, and most .NET shops are in the same mindset. (BTW, most iOS shops have lost track of all the new "bright shiny"s that Swift keeps getting, and what's worse, each release seems entirely incompatible with the previous.)
Result: I'm giving myself a +1 solely because when I asked nine colleagues (some from Java, some from C#) what the new features of Java or C# were, they couldn't remember. Considering these are folks that, like me, used to track the new upcoming release info religiously, I'm taking that as a sign that both these platforms are, in a lot of ways, "done". Not "done" meaning "dead", but "done" meaning, "There's just not a lot that I desperately want the implementors to introduce".
The TypeScript language started life as an attempt to add a strongly-typed type system to the JavaScript ecosystem, in order to provide greater up-front verification of code and get better errors earlier. (That's the general goal of any strongly-typed language, and while we can debate whether that's a good idea or not, what's not debatable is that a significant number of programmers buy into it, one of them being Anders.) But TypeScript also discovered that it could do a lot of type-level things, and TypeScript has gotten more and more complex and complicated and confusing with each release. (I've commented, elsewhere, that I lost faith in the language when the team chose not to create a language spec for it anymore, choosing instead to just say, "The code is the spec" and publish a new blog post with each release describing new features in a casual and imprecise way.) This is the year (though it could be next year, although some have said it was already last year) that TypeScript starts compiling (or not compiling) expressions that absolutely confuse the hell out of anybody without a PhD in type theory. (For whatever it's worth, two years ago somebody put together a list of 'WTF moments', and they're... interesting.)
Result: Ehhh.....? When the language implementors announce that they're rewriting the entire language in a different (entirely unrelated) language, it definitely raises some questions. Was the compiler performance really bad enough to justify this work? Will the new compiler be bug-for-bug compatible with the old one, particularly when there's no publicly-available specification to verify against? And are the developers at Microsoft so enamored of Go that they deliberately avoided some of the other languages they could easily have used (C++, if you want a native compilation target, or C#/F#, particularly in conjunction with some of the CLR AOT-to-native options--all of these come to mind...)?
But again, truthfully? Name the last new language feature TypeScript introduced and tell me why it matters to you. Whether you agree with me on the +1 really rests on whether you can answer that with any degree of accuracy.
Quietly, while nobody was looking for it, Python introduced "type hinting", where Python code can be given Pascal-style type hints on parameters and return types, allowing tools like IDEs and editors to provide better edit-time support. Most of the Python tutorials, it seems, ignored this for a while, but starting last year the notion of writing type-hinted Python became more and more acceptable to the Python community, though the majority of the community kept to "untyped" Python. My guess is, this is the year that the balance shifts, and "typed" Python begins to outweigh the "untyped" Python. Legacy "untyped" codebases will of course exist for centuries to come, but I suspect that this coming year might be when companies start scouring their Python codebase and thinking, "Maybe we should put type hints in...?".
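For anyone who hasn't bumped into it yet, here's a minimal sketch of what the shift looks like--the function and names below are purely illustrative, not pulled from any particular codebase:

```python
# "Untyped" Python: nothing tells the reader (or the editor) what shape
# 'items' is supposed to have until something blows up at runtime.
def total_price(items, tax_rate):
    return sum(item["price"] for item in items) * (1 + tax_rate)


# "Typed" Python: identical runtime behavior, but editors and checkers
# like mypy or pyright can flag misuse before the code ever runs.
def total_price_typed(items: list[dict[str, float]], tax_rate: float) -> float:
    return sum(item["price"] for item in items) * (1 + tax_rate)


if __name__ == "__main__":
    cart = [{"price": 10.0}, {"price": 4.5}]
    print(round(total_price_typed(cart, 0.08), 2))  # 15.66
```

The hints aren't enforced at runtime--they're just metadata that tooling can choose to check--which is part of why the "should we bother?" debate rumbles on.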
Result: It's hard to say, honestly. Lots of sample and demo code has sprung up using type hinting, and the coding-assistant tools love to slither it into the code they generate, but are Pythonistas actually embracing it? Well, Meta thinks so. Pyrefly agrees. And hey, if Reddit agrees, it must be true, right?
But I'm not 100% sure I agree yet. Periodically I find interesting Python repos on GitHub that are relatively large and entirely absent any type hints whatsoever, and the rank-and-file Python developer (not to mention all the data science folks, who aren't programmers and don't necessarily see the reason for type hints) isn't always reflected in the blog posts, so.... -1. I just don't feel comfortable claiming this as a correct prediction in 2025.
JetBrains put a ton of time and energy into KMP, and thus far it looks like there's been absolutely zero uptake from anyone outside of the building. If JetBrains is going to be able to call this any kind of success, they're going to need 2025 to be the year that KMP emerged as a credible competitor to React Native or Flutter.
Result: +1. For starters, they took it to production status; for seconders, their blog was filled with comparisons to React Native and the iOS or Android native ecosystems. And, in some ways, it appears to have made something of an impact, as their own developer survey suggests that there's a non-trivial number of folks using it. (The survey was for 2024, but the analysis came out in 2025, so we have to take the data with at least a little salt.)
Speaking of which, Flutter's going to start being quiesced by Google. First there were the layoffs (200-some people in October of last year), which signaled pretty strongly that Google's "done" with Flutter, all the PR statements notwithstanding. Then the community fork that came out of that news led a number of folks to think, "Wow, Google's really done with Flutter", since nobody would think to create a community fork of something that's being actively developed and maintained by the multi-trillion-dollar entity that created it. That makes the companies who make money on the things developers build (as opposed to those companies who make money on building things for developers) get very nervous about adopting or continuing to invest in Flutter, and we can expect Flutter's usage numbers and interest level to plummet as a result. Yes, the open-source tools are there, and yes, people will continue to work on them, but hey, by the way, how's the Parse community looking these days? Or LoopBack?
Result: I wasn't the only one curious about this, and when that article asked, "Is Flutter dying?" it answered itself with "No. It's evolving." That's a touch telling. On top of that, there's Flock, which is a community-led fork of Flutter, which itself suggests some amount of uncertainty about Flutter's longevity (at least as a Google-sponsored/-backed effort). (Come on, you don't create a community fork of something if the company behind it is sending out strong signals and actions towards growth and sustainability.) I've liked the technology for years, but given all this, it's hard not to see it beginning that quiet slide. +1
Like the rain in Seattle, Apple always releases a new version of Swift every year, and every year, there's something in there that's not backwards-compatible with the previous version. There is zero reason to expect that 2025 will be any different on that score.
Result: Swift 6.2 was released in September of 2025, and 6.1 in March, and 6.2 (bugs notwithstanding) is compatible with 6.1, and 6.1 is supposed to be compatible with 6.0, though the language implementors claim that some stricter enforcement of concurrent code may yield compilation errors in 6.1 that went through OK in 6.0. Still, these aren't deliberate breaking changes, in the grand scheme of things, so... -1.
Oh, I know the downsides to C++. I really do. I programmed in the language for a half-decade (back in the "bad ol' days" of pre-C++98, waaay back in the late 90s), so I'm well aware of all the dangers of pointers and what-not. But Rust isn't a high-level language, and there's way too much C and C++ code out there for companies to credibly consider rewriting all that code. We can ache and moan about computer security and stability all we like, but doing that rewrite is not a simple line-for-line replacement, precisely because of the pointer guarantees that Rust enforces--all of those pointers have to be debated, decided, and declared, and it has to be consistent across multi-million-line codebases. Rust may make some inroads into new system-level development, but it's not going to make C++ "go away" any time soon, not this year or the next (or the one after that, or even the one after that).
Result: Ohboy. THIS one. I was pretty confident about this one, until... boom and boom. Windows is a massive C/C++ codebase (though there's some other stuff in there too, to be sure), so if they're starting to replace the C/C++ code with Rust code, that's a pretty solid vote of confidence.
Does this mean that C++'s days are (finally) numbered and all those C++ devs should pack it in and learn... (um... what is the latest thing to learn again)? Probably not. Between the fact that the amount of C/C++ code in the world is ridiculously large enough to resist easy ports to anything, and the fact that Microsoft has done these sorts of "We're going to rewrite Windows in..." kinds of announcements before, let's give it some time before we count Rust as having successfully replaced C/C++ at Microsoft, much less anywhere else. But, it's definitely a turning point, and in fairness, I think it merits a -1.
All of these are system-level languages in the tradition of C++ or Rust, and all of them are interesting in their own rights. I suspect some of the enthusiasm for Rust will get siphoned off into these, and before long the "Rust will replace C++" crowd will turn into the "Rust/Odin/Nim/Zig will replace C++" crowd, before they start fighting among themselves which of those four will do the replacing (because, of course, there can only be one, right?).
Result: Well, Zig is definitely getting some airtime, for sure, particularly since it's what powers Bun, the high-performance JS runtime. Nim and Odin, though, seem to be trailing in popular usage, though each definitely has its ecosystem that is growing. Still, I look around and don't see any crowds chanting, so this gets me a -1.
There's been a quiet surge of interest in SQLite recently, since it ships pretty much everywhere (it's already on every mobile device on the planet, and its C codebase and bindings make it super-easy to FFI out to if there aren't already language-level bindings for your chosen language) and it's a pretty feature-complete subset of SQL-92. That's gotten a number of folks intrigued with the idea of "local database"-based functionality, which in turn has sparked some interest in "Wait, what exactly makes up a database, again...?" because if I can get at the storage engine and bypass the SQL-ish query layer and talk directly to the query engine and.... Well, suddenly people are starting to wonder if the database has to be the black box we've always assumed it to be. I think in 2025 we're going to start cracking it apart and doing a little plug-and-play of the components as part of, or behind, their service interfaces.
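To make that "local database" idea concrete, here's a minimal sketch using Python's standard-library sqlite3 module--no server process, no connection string, just a file on disk (the file and table names are purely illustrative):

```python
import sqlite3

# SQLite has no server: the whole "database" is a single local file
# (or ":memory:" for a throwaway, in-process store).
conn = sqlite3.connect("local_cache.db")
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("remember the milk",))
conn.commit()

# Plain SQL against the local file--no network round-trip, no ops team.
for row in conn.execute("SELECT id, body FROM notes"):
    print(row)

conn.close()
```

That same engine is already sitting on every phone in your pocket, which is exactly why the "wait, what makes up a database, again?" question gets interesting.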
Result: I'm not sure how I was expecting this to manifest, so I can't quite tell if I was right or wrong. SQLite is one of those things that's been around since forever (it has shipped on both iOS and Android devices since each platform's first release), and the fact that one can make use of it from within a browser in a variety of different ways (compile the SQLite source via emscripten to get a WebAssembly implementation directly, or use browser-native storage which is often implemented in SQLite, or use the port of SQLite to JavaScript, and that's just three ways to approach the idea) makes it really hard to track.
But if we look at the intent of the prediction, what I was really going after was custom database implementations that would start gathering some interest, and I can't honestly say I saw much of that across 2025. -1
... which puts me squarely in the middle of the bell curve. Good thing this isn't my day job!
So now let's turn our attention to 2026.
As I do every year, each prediction comes with a probability factor (p) that ranges anywhere from 0.0 to 1.0, with most predictions appearing in the 0.3 to 0.8 range. This is a trick I picked up from my International Relations days, where political prediction briefings often came with a similar kind of probability as a way of offering a "confidence" factor. Anything higher than a 0.8 means "We're pretty damn sure", and anything less than 0.3 means "We're including it mostly out of completeness but we don't expect it."
But before we get started.... You know how I keep writing more stuff each successive year, and apologize for it the following year? Not even going to pretend this time around: this one could be the longest yet. (And, to no one's real surprise, it's all about AI! Because... sigh) Brace yourself.
... because VC-backed startups need people to believe, so the VCs can exit the startups they've backed.
The landscape of the startup scene has been in the throes of a sea change for some time now, going back a few tech-generations (AI, blockchain, etc), and it goes like this:
Unfortunately, as Theranos showed us, this basically has a predictable cycle:
It's almost as if the tech industry made a deliberate decision to look at the Gartner Hype Curve not as a "lesson learned", but as one of those Maps of the Hollywood Stars that must be slavishly followed over and over again.
Meanwhile... We're in peak Inflated Expectations territory here with AI, folks. That means that, in order to keep the eyes off the bottom line, the predictions are only going to get wilder.
... because humans keep anthropomorphizing AI, and keep believing that AI can do human-sensitive things better than humans can.
Werner Vogels thinks robots can help foster companionship; I deeply reject that premise, because humans have systematically rejected machines replacing humans in all but the most desperate of cases, and because the cases where people have done so have missed the mark or ended badly. Humans can still tell the difference between machines and humans, even if they can't say why they know. Worse, if humans suspect that the voice on the other end is a machine, they'll treat it wildly differently.
But that said, lots of startups are going to go down this path, and while they might get some traction to start, they'll crater hard when they try to scale out. Because we are, all of us, lonely, and having an always-available companion seems like an easier solution than actually building empathy and connection with other humans. Certainly the more profitable one, anyway. But the results... will not help people, only balance sheets.
... because Sam Altman has made a lot of promises, and doesn't have a great track record of delivering on them (January 2025: "We know how to build Artificial General Intelligence"; November 2025: "Hey we got GPT-5.1 to use em dashes correctly when instructed (but not always)"), and 2025 saw OpenAI making even bolder promises. This is going to catch up to him in a big, big way.
Look, Sam knows how to project confidence--every CEO does at that level, it's a prerequisite of the job at this point--but that doesn't mean he knows how to actually bring it to fruition. We've all seen the meme of the financial shell game that Microsoft, NVIDIA, OpenAI, and a few others are playing with "promises of investment", or perhaps you've watched the More Perfect Union video. But Sam's playing at an entirely new level with trillions on the line, and you kinda have to wonder if he's resting much of his confidence on the idea that OpenAI is just "too big to fail" and can rely on a government bailout in the event the piper does come calling.
I'm certainly no financial expert, and I don't claim to have the inside scoop on all the hallways-of-power back-room deals that OpenAI and SoftBank and all the other players are whipping up, but Altman's already backed away from so many deadlines and dates that before too long a lot more folks are going to notice, and you can't just keep selling whipped air forever. (See: Musk, Elon.) I suspect that in 2026, the shell games will come faster, the retrenching will come quicker, and I think Sam Altman is going to find himself at the center of a firestorm large enough to make Sam Bankman-Fried look like small potatoes.
And with that, there's going to be a whole lotta finger-pointing and crashing.
... because all those data centers that AI needs are being built on the taxpayers' and utility customers' dime.
Next time your major urban city mayor is looking to "bring some jobs home", ask them what the domestic cost will be--for example, if they bring in a data center (which largely runs itself, compared to an automobile plant or steel mill), ask them if your electric bills will go up as a result, and if they'll agree to put legislation in place to force the company building the data center to foot the bill for that rise if it does. It'll spark an interesting debate at the city council meeting, at least.
(Oh, and for bonus points, ask about not your utility bill, but the bill of the most impoverished part of your city. After all, data centers don't go into posh suburban neighborhoods, they go into wherever the land is the cheapest.)
... because we have started to figure out that software development is not about writing code, but about understanding the problem well enough to be able to articulate what code to write.
To understand the "why" of vibe coding, let's accept a couple of statements as axiomatic:
Given the above axioms, there's a simple conclusion that any macroeconomist can tell you about: As demand rises, if supply remains fixed, the equilibrium price goes up. This is why programmer salaries went through the roof during the original dotcom era--and have stayed there ever since, despite the setbacks we've seen over the last three decades.
However, one other axiom needs to be cited here to make all of it make sense:
If your business is built on software, there's a strong probability that your biggest expense is... software developers. It used to be owning and operating all those big server farms that every company had tucked away someplace (usually at the center of the building behind glass walls because damn it looks impressive to potential investors to see row upon row of machines with blinky lights...), but the cloud took care of all that.1 And, to be clear, software development labor has been a sizable part of most companies' expense sheet for (checks notes) three decades now, so the cloud thing helped some, but the original problem remains.
So if you're a software development shop, how do you cut the expenses of your software developers?
You look for ways to make software development faster. After all, faster means fewer hours means less cost (because cost = hourly rate x hours). We see it everywhere along the way, in how new programming languages are marketed, new tools are sold, and new practices are introduced. They might vary in their message, but somewhere in each pitch there's a prominent tagline about "greater productivity". Remember the 4GLs of the 90s?
You look for ways to make software development cheaper. "Hey, maybe if we go find software developers in countries with way lower standards of living, we can pay them less to do the same work." Alternatively, we also saw the rise of "coding bootcamps" that promised to take people with no coding background whatsoever and turn them into software developers in just twelve months--no, nine--no, six--no, no, you fools, overnight. Naturally, it gave rise to scams and lawsuits.
You look for ways to avoid software development entirely. In macroeconomic terms, what substitute goods/services2 can we use to get what we need? In common parlance, this translates to "What if we use something that doesn't require software developers yet still builds software?"
Some of you are old enough to remember the CASE tools of the late 80s or the UML tools of the late 90s; those were both plays to avoid software development, though in retrospect it's easy to see how neither one really dodged the basic need, which is "You have to understand the problem deeply enough to be able to figure out how the software should work to solve it."
Vibe coding is, in many respects, the Visual Basic of the 2020s: A tool that's designed to let "anybody" create software (thus neatly avoiding the high cost of software development) and finally slay the Dragon of Infinite Software Demand by allowing supply to run free, no longer burdened by the obstacle of knowing how software works.
But, if you were paying attention two paragraphs back, you already know why vibe coding is failing: "You have to understand the problem deeply enough to be able to figure out how the software should work to solve it." It's not that we don't have tools that can churn out millions of lines of code; the problem is that you need to know how to break the problem down into smaller pieces such that the AI coding tools can actually have a decent chance of generating code that will solve it. (Ironically enough, this is where most software development processes--whether AI-driven or not--run into issues.)
The reason for the prediction? We've had these "vibe coding" tools in our hands for a half-year or more now, and we're already seeing their limits. In 2026, they'll remain, but their use will be limited to building quickie prototypes, and there really isn't a market for tools that can only build quickie prototypes.
... because AI's not gaining much steam anywhere else.
The aforementioned AI startups are constantly trying to sell us on AI in HR, medicine, and finance, but honestly, after a year or so, it doesn't seem to be having much effect, and companies are starting to backtrack and re-trench. For starters, companies tried to use AI agents to do hiring, and that landed... poorly. Things really turned south when companies realized that candidates could do that too, and did. Maybe we can outsource other HR tasks... wait, I think I've heard that one before.
Likewise, AI's use in the legal space has... problems. Sorry, Mike Lindell, you can't just hallucinate your way into proof of the 2020 stolen election! And he's not alone.
Turns out, lots of people get grumpy when AI just makes stuff up out of nothing (even people who are paid to do exactly that), and that trend is only likely to continue. Which raises the question: if you're in the AI business, where can you let the non-deterministic nature of generative AI fly, in a space that's open to it yet structured enough to not be hamstrung by it?
Coding, of course. Programming languages require structure and conformance to form (minimizing the hallucination possibilities), particularly if backed by unit or integration tests, but also require some creativity in translating "problem" to "solution". The coding agents have already had some measure of success--see the aforementioned "vibe coding"--but we're starting to understand their limitations--see the aforementioned "vibe coding". In 2026, that curve is going to pick up steam, and we'll see it get stronger as the year goes by.
... because it captures a large part of the essence of programming.
When Don Syme (he of the F# language) and John Lam (whom I'm lucky to have worked with at DevelopMentor back in the day) both together started talking about "specification-oriented development", it rattled my brain a bit. These are smart dudes, and if they think there's something there, enough to both blog and implement something around it, then I'm curious by default.
"Spec Kit makes your specification the center of your engineering process. Instead of writing a spec and setting it aside, the spec drives the implementation, checklists, and task breakdowns. Your primary role is to steer; the coding agent does the bulk of the writing." On the surface of this, I like where this is going, because elevating the abstractions in programming is something that I've been advocating for a very long time now. (We've been surfing at the "object" level for three decades--what say we start re-thinking the problem again?)
I'm not 100% convinced that this is going to be the "path forward" for our programming, but it's pretty clear that some kind of LLM-driven engagement, whether through coding agents as part of an IDE, or through these "specification" files, is going to markedly color much of what we programmers do for the next few years. I think there's a fair amount of rigor in what's being suggested here, but at the same time, this is flying in the face of some pretty well-established doctrine, and there's a pretty high mountain of evidence against the idea of natural language as a deterministic way to write code.
So this looks interesting, but I don't think it's going to take over the world yet. It'll capture some attention, it'll get some revision, it'll maybe spark something else that takes over the world, but one way or another, it's going to get some traction over the next year. (What it really needs, though, is a good tutorial/walkthrough on how to use it, both for the simplest "Hello world" case and for a non-trivial example case. Right now the docs on the GitHub repo are a little vague in a lot of places.)
... because coding assistants can carry the weight of syntax better than humans can.
This also means I'm not making any generalized predictions about Java, C#, Python, Swift, or any of the other current crop of languages--there's simply little to no point, at least in 2026. They're still important, don't get me wrong, but their importance is now at more of a strategic level than a tactical one. (Meaning: CTOs and VPs won't care about which version of Java you're using, but will care about whether you're using Java as opposed to C# or Python.)
... because the Fallacies of Distributed Computing still hold, and servers making all these remote calls (over MCP) will slow down model responses, so we'll look for ways to bring them all together under one roof--which means having models small enough to execute locally.
... because also, we're going to start realizing that we don't want "god" agents, because that increases the security risk. We're going to want agents that know how to do one thing and don't try to do anything else.
... because we still desire deterministic behavior out of our programming languages.
This is going to be the natural outcome of (a) more AI coding agent use, (b) SLMs gaining traction, and (c) a strong desire to "smash together" the agent into the IDE and then the language, which we see already in the aforementioned specification-oriented development. Programming languages have the ability to abstract away details we don't want to deal with (a la memory management or pointers), so it stands to reason that some enterprising language designer will figure out how to fold all of that into a text-based input scheme that has just enough rules to allow the agents to shine, while keeping the non-determinism out.
... because the computing power is there; we just need to figure it out.
I won't go too deep into the details here, but quantum definitely represents a massive leap forward in computing capability, and while your average video game has no need for it, lots of other applications do (including, maybe, future LLMs?), and it's too enticing to ignore. Somewhere before the end of the decade, we're going to start seeing developers working against quantum SDKs and using quantum languages, but don't ask me what they'll look like--I'm having a hard enough time understanding the basic science of a qubit to start with!
... because Google wants it to, and mobile devices have long since become powerful enough to be laptop replacements.
You know what the line is between a tablet and a laptop? Me neither.
For years, laptops have tried to stave off the tablets by offering touch-based input (well, all of them except macOS, anyway), with varying degrees of success. Laptops had a few things going for them--larger screen sizes, more ports, an operating system that wasn't hamstrung by only being able to install things through the app stores, and so on. But over the last few years, we've seen those things slowly evaporate (except for the "more ports", which Bluetooth connectivity, wireless protocols, and dongles/hubs are making irrelevant for the most part), to the point now where many people get a larger iPad and a Bluetooth keyboard, and call that a laptop.
The manufacturers feel it; it's why Lenovo and Samsung have tablets that act like laptops. Google, meanwhile, already announced that they want to do away with ChromeOS in favor of Android, and the work over the last year pretty much demonstrates that they're ready to start backing that with laptop or desktop hardware.
Are you going to get Android on a desktop machine? Probably not--a good chunk of the rationale for a desktop purchase these days is gaming, and Android still isn't anywhere close to that world yet, with the other chunk of the rationale being development, which Android again isn't close to. (Although, if I'm being frank, give me a good keyboard, a good monitor, a command-line prompt, and a half-decent editor, and Android's no worse than many Linux distros at that point.) Either way, it feels like the natural path is for Android to slide into the space that used to be occupied by ChromeOS, and that's through Chromebooks, i.e., laptops.
... because Glass mode is just... uninspiring.
Seriously. Apple tried really hard to make Apple Intelligence a thing, but in the end, all we're seeing is better and more cameras on the iOS devices, and frankly, yawn. One of their most recent commercials was showing how to use iPhones in cinematography. So we literally have Apple urging us to buy their product by making commercials on... making commercials.
... because they're good enough, and because some humans would rather work against other humans than for humanity.
If "AI actors" are good enough to be considered for use by Hollywood (lookin' at you, Tilly Norwood!), then it takes almost no thought at all to realize that she could very easily show up in a faked Zoom call, FaceTime, or news report. She and her cousins are going to be the source of scams, lies, and even deeper distrust of what we see on the screen.
Ironically, it could be what gets us off those screens, but let's see how it all plays out. Personally I think the growing "digital divide" is going to center on how much one bases their life on their digital footprint, and we could very well see Generation Alpha (or the one that follows it) utterly rejecting some of the lifestyle approach that Millennials and GenZ embrace because of all this. It will be very interesting to watch over the next decade or two.
I have no idea which one, but I'd love for some suggestions and/or requests. I love the shows I hit regularly, but I wouldn't mind going someplace entirely new and talking to people who've never heard me speak before. I'm giving it a slightly lower probability because full-time work always makes this trickier, but it's still a goal.
I know, I know, but we're serious this time. We're in the final proofs-editing stage now, so the question is really whether it'll appear on shelves (figuratively) in February or March.
As always, feel free to comment and tell me why I'm crazy, in whatever forums feel comfortable. In the meantime, talk to you all next year... if not sooner.
1. Theoretically, anyway. We're starting to see that the cost of owning a virtual machine in the cloud isn't all that much better than owning it on-premises, which is leading a number of companies beyond a certain size to reconsider owning and operating their own data centers. It's still way better for smaller companies, which get to benefit from the margins a big cloud provider gains by running lots of servers, but at a certain size, it just doesn't make sense anymore.
2. https://www.economicshelp.org/blog/glossary/substitute-goods/