06 January 2025

It's that time of the year again, when I make predictions for the upcoming year. As has become my tradition now for almost two decades, I will first go back over last year's predictions, to see how well I called it (and to keep myself honest), then wax prophetic on what I think the new year has to offer us.

As per previous years, I'm giving myself either a +1 or a -1 based on a purely subjective and highly biased evaluation of whether it actually happened (or, in some cases, at least started to happen before 31 Dec 2024 ended). If you want to skip to the new predictions, scroll down to the next major heading.

In the large, 2024 was... ugh. We thought 2023 was tumultuous, but then 2024 said, "Hold my beer." The job market for the tech sector started to look up, then started to look terrible again, then people said, "No, it's really getting better" even as others said, "No, this is the worst it's ever been," and most of the rest of us just threw our hands up in the air and stopped trying to guess. Anecdotally, I keep seeing lots of people on LinkedIn down to their last months of reserves, but at the same time LinkedIn keeps sending me all sorts of interesting openings that, when contacted, yield nothing but whispers. I kinda gave up trying to figure out where we are--though I'm still looking for something full-time myself. (If you find my analysis here to be interesting or intriguing--even if you disagree with it--perhaps there's a role in which I can do this kind of strategic and executive thinking on your company's behalf? Would love to hear from you.)

In 2024...

... I wrote a lot of stuff. (That always seems to happen when I do these.) So let's get to it; as I do each year, I'll include the full text of what I wrote in each bullet point first, then put the Result after it with whatever insights or comments seem relevant. (Arguably none of them are, but hey, it's my set of predictions, so....) As always, evidence is highly opinionated, mostly anecdotal, and entirely from my (and some from my professional network's) perspective, so caveat emptor.

Not surprisingly, a lot of what I wrote was about "AI". (I put the term in quotes both because I have no illusions about machines becoming actually intelligent, and because I don't think the current area of research into Large Language Models and generative AI is the sum total of AI research. Where's my "fuzzy logic" from the 1990s, people?) And, by the way, I guarantee that AI didn't write a word of this post, or any other. (When I use the word "delve", it's because I damn well wanted to!)

Hiring will open up. (Probability p 0.8)

"The interesting thing about the layoffs is that while they continue to trickle out from various companies all the way to December of 2023, it's a trickle compared to the flood that happened in 1Q, particularly in January. More interestingly, the kinds of people that are going to be hired will shift slightly--right now (end of 2023), companies are focusing on hiring ICs to fit into existing holes in their current org tree. But with the new year, and a bit of optimism in the market, they'll start growing new branches to that tree, and that will require more in the way of engineering management and "ancillary units" like developer relations, R&D, and so on. It'll take a while, and it'll never be like 2019, but before too long things will get stronger."

Result Well.... I really don't quite know how to score this one. On the one hand, some statistics suggest that hiring did open up ("According to the Bureau of Labor Statistics, 829,000 jobs were added in Q1 2024"), but obviously lots of people aren't feeling it. The thought process here is that yes, the jobs started to come back, but it turns out that there are a lot of people still looking, so the jobs disappear quickly--going to internal candidates, going to overqualified candidates, or in some cases never really existing to be filled in the first place. I'm thinking this is a -1 to my predictions score, even if technically I might have been right.

WebAssembly will gain more traction. (p 0.5)

"I haven't really paid much attention to WebAssembly for the past few years, because I wasn't sure where things were going. The 1.0 specification is done, but it's pretty minimal compared to other richer, more full-featured virtual instruction sets (like the JVM or the CLR). Truthfully, it's a ridiculously bare-bones spec. But, even as I say this, there's some interesting infrastructure that's targeting LLVM more and more (most notably LLVM and GraalVM), making it easier and easier for WASM to gently "slide in" to a deeper role within a dev toolchain and do some interesting things. That said, though, WASM still has yet to really demonstrate where it's real value lies--why write something to WASM binary, which generally means targeting the browser, when there's so many Javascript-centric options already? Where's the compelling value-add from the developer perspective? This is the hinge on which WASM's broad-scale applicability matters: If WASM can show how using language-X-compiled-to-WASM enables a new kind of feature or a faster/easier workflow to get to "done", then WASM could very well begin to emerge as the infrastructure backplane for browser-based applications all over the Internet. I don't know what that compelling reason is yet, though, which is why I leave this at a 0.5 p."

Result Well.... Yeah, no, WebAssembly remains something of a niche space here at the start of 2025. It's almost ironic: You know how everybody in the Linux world keeps saying, "THIS is the year for Linux on the desktop!"? Well, I think they're about to be challenged by the WebAssembly crowd for the title of "Most Consistent Prediction that Never Comes True". There's some interesting technology here, but WASM just can't seem to catch a "hook" to catapult it out of the "Interesting-Yet-Uncompelling" territory it seems mired in. Which, we will note, I pointed out: "WASM still has yet to really demonstrate where its real value lies--why write something to WASM binary, which generally means targeting the browser, when there's so many JavaScript-centric options already? Where's the compelling value-add from the developer perspective?" Still, it's a -1 to my prediction.

Generative AI will lose its luster. (p 0.7)

"It already has started, in many ways--memes now are common across the Internet showing ChatGPT conversation windows in which it responds to the question, "How do I use GitHub?" with "In order to push code to the remote server, issue the command git branch..." in a completely erroneous and mistake-riddled answer. The stories of the lawyers who used ChatGPT to write legal briefs that turned out to be filled with inaccuracies and errors--for which they were disbarred, by the way--have cast some serious doubt on the ability of these generative AIs to "fire all the peons" whose work ChatGPT and its ilk were supposedly going to replace."

Result Well.... The tech world is pretty clearly dividing into the "anti-AI" (aka "AI haters") and "pro-AI" (aka "venture capitalists") crowds, and frankly the more we see playing out in front of us, the more it looks like generative AI has reached its limits. (When OpenAI says they need more money than has ever been produced in the history of mankind in order to progress, you kinda get to draw that conclusion.) But then we also had something of a Hollywood scandal drop, and suddenly it was mainstream news to talk about how AIs need to train off of existing data. ScarJo's lawsuit remains unresolved (as far as I know) at this point, but it definitely sent a ripple through a lot of the entertainment industry. I'm going to give myself a +1 for this one, because when it makes the 5 o'clock news, and not in a good way, it's "lost its luster". (She wasn't alone, by the way--others, like John Grisham and George R. R. Martin, filed a class-action lawsuit as well. This is definitely not a "finished" discussion.)

We will begin to disambiguate between generative AI and large language models (p 0.5)

"It happens with AI every decade or so--something new comes along, it takes our imagination by storm, and we start chanting "AI! AI! AI!" like an amped-up-on-booze-and-caffeine-pills crowd at a wrestling match. Then, as we get disenchanted with the results, the various AI firms--who have yoked their success to the success of their AI work--will start to point out that they were never "AI" companies, they were "large language model" companies and therefore never a part of the hype machine that's now augering in."

Result It's very faint, but you can start to hear it in some of the tech punditry. There are some interesting things going on with generative AI (one of my D&D gaming buddies discovered https://suno.com and has started creating ballads and epic dirges about our campaign), and some of it isn't about language at all. I think, realistically, this scores as a 0, but it's definitely going to get more differentiated in 2025.

Custom AI models will begin to gain traction. (p 0.6)

"Making use of a large company's model for your natural language analysis needs is great, but over time, companies are going to figure out that the cost of that model will probably be smaller if it can be hosted locally, rather than constantly going to the cloud and back again. (See the economies-of-scale discussion earlier, as well as the performance and reliability implications, a la "The Eight Fallacies of Distributed Computing".) Slowly, quietly, companies with some room in their budgets are going to start looking to develop their own models, and in many cases may find that the model's they're building don't need to be quite so large (and heavy), that in fact a "medium" language model, or even a "small" language model would work, allowing for a local presence on the same node as the rest of the processing. OpenAI and other firms are going to combat this by constantly releasing new models with new excitement and fanfare, but like the cloud itself, the basic path here will be "start with the hosted LLM, then as your success and budget grows, look to build--and tune--your own". It'll be "Buy vs Build" decision-making, all over again."

Result The "quietly" part of that prediction is what trips it up--if companies are starting to build out their own models, they're not talking. Perhaps the best thing at this point is to punt on it for a year, and give myself a 0 and say, "Let's see where it is in 2025."

New and interesting languages will begin to make waves. (p 0.6)

"I've been saying this for years, but I keep holding out hope. Frankly, I think some of the development will be to raise the abstraction levels of what we do, because ChatGPT's success at writing code is due to the fact that 95 times out of 100, we're writing the same basic CRUD crap over and over and over again. The Pareto Principle holds here: 80% of all mobile applications (as an example) are doing the same stuff, it's just that it's a different 80% from all the rest. A language/stack that can embrace an onion-style architectural approach (allowing for "trap doors" to drop down a level of abstraction when/if necessary, such as how C/C++ allowed for inline assembly way back in the day) will be the answer here.

"The other element I imagine will/could be interesting to explore will be the intersection of what natural language models and compiler front-ends will look like--if we can get a language to start looking more like a natural language, it might enable some interesting DSL-type scenarios for end-users, and reduce the demands/pressures on developers to respond to every little change."

Result Hello, Wing. Hello, Wasp. You're not quite what I was expecting or anticipating, but you're starting to get some interesting traction, and you're certainly not alone. You're also not natural-language-based like I'd anticipated (or hoped!), but languages take a while to bake. I personally have some ideas on how this should look, but mine is but a small whisper of an idea, and I expect there will be vastly brighter people than me going into this space. Still, I'm feeling generous to myself: +1.

Cracks in the "full-stack developer" facade will grow. (p 0.7)

"The demand for the "full-stack developer" is mostly a cop-out on the part of managers, who don't want to artificially restrict their employee search because "hiring is hard". So, instead of carefully considering what skills the current team lacks that needs shoring up, they instead scribble "full stack developer" on the job description, and proceed to rattle off every single technology that the team has ever used."

Result Some folks are definitely not seeing it. Of course, many of those folks are tech-training companies, whose business depends on developers (or their managers) feeling inadequate and needing training. Others, however, are pointing out the obvious flaws in the approach, that "FSDs" can't really scale out as well as specialists can. Still, it seems that the recruiting posts and hiring managers still have "FSD on the Brain" syndrome, despite the clear drawbacks to trying to hire jacks-of-all-trades, so I have to give myself a -1 here.

AR/VR will start showing glimmers of life (p 0.5)

"If AR/VR is going to achieve anything of any substantial value, it really needs to do so soon, before it is relegated to the "Interesting Things That Were Out Of Their Time" bin, like the Apple Newton and the Xerox PARC. This year might--might--be the year it presents us with something, particularly given the rise in interest in wearables. Certainly, with Meta going all-in on the whole "Metaverse" thing (whoever came up with that should be fired), there's going to be no shortage of money thrown at the problem, the big question is, will there be any actual innovation around UI? Because so long as the AR/VR thing is solely game-centric, it's fairly safe to assume that the AR/VR will go right into that bin without too much hesitation. I know, games are huge industry (measured in the trillions now?), but it's not enough to support hardware; witness the difficulties even conventional gaming hardware (for example, joysticks--today's flight sims are vastly more complicated than the ones twenty years ago, yet nary a joystick to be found) has had over the decades. If AR/VR (which requires hardware specific to it) is going to reach a point that justifies buying specific hardware for it, it has to be for something more than just gaming (and please don't say "education", because if there's one place in the world that has almost no budget, it's education)."

Result Oh, Ted. Sometimes you just have to stick to your guns on a theory, and not try to offer a tech a way out of its fate. With Apple basically shutting down the Vision Pro, and Microsoft already out of the HoloLens business, it seems like AR/VR is probably "done" for the foreseeable future. Meta is still a player here, though, and they're large enough to continue to throw good money after bad, so there's still hope, I suppose, that Oculus will suddenly make a splash... but I doubt it. -1

"Car apps" will begin to become "a thing" (p 0.6)

"More and more vehicles out there are coming with computers in the console, offering easy connection to your mobile device. What's been lacking has been any sort of apps for it, beyond the basics (maps, audio entertainment, etc). Now that there's more of a target market, perhaps we are reaching a critical mass of potential customers to justify the investment in building apps specificall for vehicles."

"If this does start to take hold, it will be interesting to see what sorts of apps get built (I could imagine some CRM apps for certain kinds of salespeople, for example) and what form user interaction takes hold (voice control and interaction would be very important for drivers, for example). Frankly, though, the hard part will be the core innovation itself--what sort of apps do we want when we're driving?"

Result "Car apps"? What was I thinking? -1

Kotlin will remain an "Android-only" language (p 0.5)

"This one is a hard one to call, but I'm flipping a coin and landing on the pessimistic side of the "Will Kotlin break out of Android-only?" question. For those who weren't following the Kotlin story at home, Kotlin has recently done a couple of things to make it more than just "that language you use for Android" by developing Kotlin Multiplatform and Kotlin Native. These open up the Kotlin language for use in more situations, but their success will really hinge on how many developers actually want to use Kotlin in more places than just their Android codebase.

"Historically, multiplatform languages have not done well, with the sole (arguable) success being that of Java--whose "Write once, run anywhere" campaign didn't really accomplish much. (Remember, most of Java's success was in the server room, not the desktop.) Native might have more possibility for success, but either one gaining any traction would be an interesting development and potentially grow Kotlin beyond just "an Android thing"."

Result Kotlin remains firmly in the Android space, near as I can tell, and nobody seems all that interested in Kotlin Multiplatform. -1 It's certainly possible that the interest can build, but if there's not much of a splash at its introduction, it's hard to build later (without some kind of compelling shift in the market, anyway). Remember, the only reason we're talking about Kotlin today is because Google adopted it as its language of choice for Android (in order to get clear of Java, some might suggest).

C# and Java will continue to "pile on" with features (p 0.8)

"Both languages have reached a point where the weight of the language is really beyond the ability of any one person to keep track of what's there, yet each time a new release comes out, it comes with a whole slew of new proposed features designed to bring new syntax into an already-existing part of the language or an attempt to cram yet-another-interesting-idea into an already "overloaded-collection-of-features"-oriented language. (Can you really look at either C# or Java and tell me that they are "object"-oriented anymore? You can go quite a ways with either and never see or hear an object anywhere along the way by this point.)

"Look, maybe this is my "Gitoffamahlawn" moment to these predictions, but both of these languages are old enough to drink, they each spun up powerful and resilient platforms that have a number of newer, simpler, more-concise languages, and at a certain point in time, it's reaasonable to say, "This one is done. There's nothing new we need to add." To keep cramming new stuff in is a terrible addiction--it's a way to show that "We're still hip! We're still doing cool stuff (...and therefore justify why companies should still be paying us license fees!)" We keep telling startups not to do it, but our two major providers of software development tools can't seem to get the lesson themselves.

"Spend some time with F#. With Clojure. With Kotlin. Play with some of the new ideas (Mint, Io, Alloy). Or even go back and experiment with some of the classics (Smalltalk, Self, LISP, Scheme). But, for your own sake, stop enabling the relentless pounding of the feature surf on the beach of your brain by breathlessly hanging on "What comes next", because in time, you'll be battered into sand."

Result Yeah, lawn, off, get. +1

Crypto falls even further. Blockchain struggles to reinvent. (p 0.7)

"Look, faith in cryptocurrency is really at an all-time low, except for the people who are caught holding the bottom of the Ponzi pyramid, and they desperately want to hype the hell out of it so they can pass the bag on to the next level down. That won't change. Blockchain, the world's worst-designed distributed database, will continue to wrestle with the goal of finding a reason to exist, and hey, could maybe even find one bullseye, given enough darts. (Many companies that invested in Blockchain have either decided to walk away from the dartboard or else that they're going to load up on a lot of darts--and maybe a shotgun or two--until they hit that bullseye.)

"I really would love for these two to just go away entirely, but they won't, because nothing ever dies completely. But this is the last year they're going to get much--if any--serious time in the tech press, despite the fevered efforts of a precious few on LinkedIn. (Yes, Andreeson-Horowitz is going big into it; yes, they've been big into it since it's start; yes, there will always be people who see what they want to see, instead of following actual evidence; and yes, folks, there's still time to dive in and turn your profit, just don't be the last one holding the pyramid up!)

"And, for me, 2024 will be the last year I talk about this in any form."

Result Well, this was well on the way to a +1... and then Trump decided to breathe new life into the whole mess by starting to tease greater legitimacy for cryptocurrencies, floating the idea of a "strategic bitcoin reserve". That's got the whole space chittering with excitement, and naturally the markets went haywire. This seems like a good idea? Sadly, I fear even my "this will be the last year I talk about this in any form" prediction can't hold true. -1

Phishing attacks go ballistic (p 0.7)

"Thanks, AI! With all these generative natural language tools (like ChatGPT), it will be easier than ever to generate emails intended to trick users into surrendering their credentials to fake sites. That will lead to more breaches, more private info leaks, and more of all the wonderful things that come with those.

"What will make it even worse will be all the security companies that roll out "AI-based" security tools, claiming that the AI will somehow be able to better protect against those very same attacks--but while generative AI can create really good-looking human-facsimile output, it's not always great at recognizing artificially-created human-facsimile input. Which means the attackers just got a boost to their toolset, and the defenders will be looking to try and keep up this year."

Result Turns out, it's not your kid in jail needing bail money, after all. Fortunately, AI can also fight the scammers, by boring them to death! JFC. +1, but I hate myself a little bit over all this.

Databases will incrementally improve (p 0.7)

"... as opposed to releasing something drastic. There is some interesting ideas percolating over in the world of DBOS, spearheaded by one of the principals in the Postgres world, but I have a feeling that it'll take a little bit for those ideas to brew before there's really something there to evaluate. (Hope I'm wrong, though.)

"Meanwhile, though, players in the RDBMS world will slip a new feature into the mix by polluting the relational model just that wee bit more, the NoSQL players will pick just a tiny bit more from the relational world and add that, and developers will continue to choose which player they want to play with based on criteria that touches maybe 1% of the feature set of the player they choose."

Result DBOS continues to churn along, and overall, relational databases continue to introduce interesting ideas into SQL and get further away from the relational purity Codd and Date and the others sought back in the 70s. JSON is now a fixture in most SQL dialects, and it wouldn't surprise me (or anybody) if an LLM shows up in there somewhere, maybe shoehorned between the XML-generation code and Java bindings. (You did read parts 14 and 13 of the SQL specification, didn't you?) +1
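
(To make the "JSON is now a fixture" point concrete, here's a minimal sketch using SQLite's built-in JSON functions from Python--chosen only because it's the easiest dialect to demo from a script; Postgres, MySQL, SQL Server, and Oracle all have their own dialect equivalents. The table and data are made up for illustration.)

```python
import sqlite3

# SQLite's JSON functions (json_extract, etc.) ship enabled in modern builds;
# the big RDBMS players expose similar functionality in their own dialects.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, doc TEXT)")
conn.execute(
    "INSERT INTO orders (doc) VALUES (?)",
    ('{"customer": "Ada", "items": [{"sku": "X1", "qty": 3}]}',),
)

# A relational query reaching into a JSON document column -- relational purity
# be damned.
row = conn.execute(
    "SELECT json_extract(doc, '$.customer'), "
    "       json_extract(doc, '$.items[0].qty') FROM orders"
).fetchone()
print(row)  # ('Ada', 3)
```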

I'm publishing a book (p 0.8)

"I and a few other folks are working on a book based on the Developer Relations Activity Patterns catalog that I drafted a ways back, expanding on the material there in a big way. That'll come out in 2024, almost guaranteed."

Result +1... almost. It's almost done, so it'll come out, just not quite in 2024.

Final tally

... which puts me squarely in the middle of the bell curve. Good thing this isn't my day job!

So now let's turn our attention to 2025.

As I do every year, each prediction comes with a probability factor (p) that ranges anywhere from 0.0 to 1.0, with most predictions appearing in the 0.3 to 0.8 range. This is a trick I picked up from my International Relations days, where political prediction briefings often came with a similar kind of probability as a way of offering a "confidence" factor. Anything higher than a 0.8 means "We're pretty damn sure", and anything less than 0.3 means "We're including it mostly out of completeness but we don't expect it."

The 2025 Tech Predictions are going to be wild (p 0.8)

Sometimes as part of writing these predictions, I take a look around the Internet (thanks, Google!) to see what other people are predicting, and let me tell you, the predictions for this upcoming year are off the hook. Autonomous robots. Human-computer brain augmentation. AI-generated AI creating new AI. If you ever saw it in a sci-fi movie, somebody out here on the Internet is predicting "this is the year"! (BTW, did you know that this year is going to be the year of Linux on the desktop? It's true! I read it on the Internet!) And the corresponding benefits are even more wild: Humans will be replaced by the singularity. No, humans will all be able to retire thanks to the singularity. No, humans are the singularity. More than any other year, 2025 really seems to be catching fire in people's imaginations, and they're letting their tech-freak flag fly.

"Developers are obsolete" prose will peak, then dwingle (p 0.8)

Look, let's be honest: Lots of people have a vested interest in seeing software development's "hold" over the creation of software broken. Startup founders desperately want to believe they don't need a CTO or any of those really expensive "full-stack" people to build out their idea. Non-FAANG tech-adjacent companies look at their labor costs and think, "Do we really need to have all these expensive people on our payroll?" AI venture capitalists (who want to see their investment make a fortune quickly, so they can bail and reap the high stock price) want people to buy into the idea that AI can replace... pretty much anybody... without waiting for actual evidence of the success of replacing humans with AI. It's all going to hit a fever pitch this year, as the "pro-AI" crowd tries to drown out the growing surge of "anti-AI" evidence, then the bubble will burst, the AI startups will start to collapse, and the VCs will start hunting up their next hype to invest-pump-dump.

AI startups are going to start failing (p 0.5)

It may be this year, or it may be next year, but the AI venture-capital-gravy-train is about to run out. Fully 25% of all VC money went into "something something AI", and it's eerily reminiscent of the "dot-com" years, when a huge percentage of all investment and VC money went into "something something Internet" and gave us such wonderful profit-avoidant monstrosities as "pets.com". (Seriously. The sock puppet. It's all that's left of what was one of the most trumpeted startups in the early 2000s.) But eventually, your business has to actually make money (even you, Sam Altman), and when the profits aren't there, the shutdowns follow.

ARM is going to start eating into Intel in a huge way (p 0.7)

In the beginning, there was the x86. And Intel rode that line into massive, generation-spanning profits. But slowly, over time, the ARM chip architecture has been steadily eroding the bulwark that x86/x64 erected ("You'd have to recompile everything! Nothing would run out of the box! Dogs and cats will merge to form a master race that will enslave humanity!"), such that by the end of 2024, not only was just about every shipping device without a built-in keyboard running ARM internally, but Microsoft had a version of Windows running on the ARM chip. This upcoming year, there are literally no barriers left to ARM's complete dominance of the CPU market. The "Apple Silicon" MacBooks were just the vanguard of the horde of new ARM-based laptops that are going to come out and essentially take over. (As a caveat: if you're a native-language developer--as in, you use C, C++, Swift, Rust, Go, Nim, Zig, and so on--and you've not started learning what ARM assembly looks like and how it behaves, now's a good time to start.)

Java and C# aren't going to introduce anything significant (p 0.7)

Look, I've been saying this for a few years now, and I'm going to make this the last year I do so, but these languages are so mainstream right now that nobody (except for conference speakers, who constantly mine the feature lists for "What's New" talks for the coming year) really (a) pays much attention to their "v.Next" featureset, or (b) cares. The price of mainstream status is a desire for "sustainability", which means most Java shops are not moving versions unless it's an "LTS" release, and most .NET shops are in the same mindset. (BTW, most iOS shops have lost track of all the new "bright shiny"s that Swift keeps getting, and what's worse, each release seems entirely incompatible with the previous.)

TypeScript jumps the shark (p 0.6)

The TypeScript language started life as an attempt to add a static type system to the JavaScript ecosystem, providing more and more up-front verification of code so that errors surface earlier. (That's the general goal of any strongly-typed language, and while we can debate whether that's a good idea or not, what's not debatable is that a significant number of programmers buy into it, one of them being Anders.) But TypeScript also discovered that it could do a lot of type-level things, and TypeScript has gotten more and more complex and complicated and confusing with each release. (I've commented, elsewhere, that I lost faith in the language when the team chose not to maintain a language spec for it anymore, choosing instead to just say, "The code is the spec" and publish a new blog post with each release describing new features in a casual and imprecise way.) This is the year (though it could be next year, although some have said it was already last year) that TypeScript starts compiling (or not compiling) expressions that absolutely confuse the hell out of anybody without a PhD in type theory. (For whatever it's worth, two years ago somebody put together a list of 'WTF moments', and they're... interesting.)

Python type hinting goes mainstream (p 0.6)

Quietly, while nobody was looking for it, Python introduced "type hinting", where Python code can attach Pascal-style type hints to parameters and return types, allowing tools like IDEs and editors to provide better edit-time support. Most of the Python tutorials, it seems, ignored this for a while, but starting last year the notion of writing type-hinted Python seems to have become more and more acceptable to the Python community, though the majority of the community has kept to "untyped" Python. My guess is, this is the year that the balance shifts, and "typed" Python begins to outweigh the "untyped" Python. Legacy "untyped" codebases will of course exist for centuries to come, but I suspect that this coming year might be when companies start scouring their Python codebases and thinking, "Maybe we should put type hints in...?"

(BTW, Pythonistas, you really need to read--nay, study--the PEP in detail. It's massive.)
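
(If you haven't looked at type hints since they landed, here's roughly what the before-and-after looks like--a minimal sketch; real codebases layer on generics, Optional, Protocols, and a checker like mypy or pyright on top.)

```python
# "Untyped" Python: runs fine, but tools can only guess at what's intended.
def total(prices, tax_rate):
    return sum(prices) * (1 + tax_rate)

# Type-hinted Python (PEP 484 and friends): identical runtime behavior -- hints
# are not enforced when the code runs -- but editors and checkers like mypy can
# flag total_typed("oops", 0.08) before it ever executes.
# (list[float] as a builtin generic needs Python 3.9+.)
def total_typed(prices: list[float], tax_rate: float) -> float:
    return sum(prices) * (1 + tax_rate)

print(total_typed([9.99, 4.50], 0.08))
```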

JetBrains tries very hard to push Kotlin Multiplatform (p 0.7)

JetBrains put a ton of time and energy into KMP, and thus far it looks like there's been absolutely zero uptake from anyone outside of the building. If JetBrains is going to be able to call this any kind of success, they're going to need 2025 to be the year that KMP emerges as a credible competitor to React Native or Flutter.

Flutter begins its quiet slide into obscurity (p 0.7)

Speaking of which, Flutter's going to start being quiesced by Google. First there were the layoffs (200-some people in October of last year), which signaled pretty strongly that Google's "done" with Flutter, all the PR statements notwithstanding. Then the community fork that came out of that news led a number of folks to think, "Wow, Google's really done with Flutter", since nobody would think to create a community fork of something that's being actively developed and maintained by the multi-trillion-dollar entity that created it. That makes the companies who make money on the things developers build (as opposed to those companies who make money on building things for developers) get very nervous about adopting or continuing to invest in Flutter, and we can expect Flutter's usage numbers and interest level to plummet as a result. Yes, the open-source tools are there, and yes, people will continue to work on them, but hey, by the way, how's the Parse community looking these days? Or LoopBack?

Swift will release a new version, and it will have incompatibilities (p 0.8)

Like the rain in Seattle, Apple always releases a new version of Swift every year, and every year, there's something in there that's not backwards-compatible with the previous version. There is zero reason to expect that 2025 will be any different on that score.

Rust will not replace C++ (p 0.7)

Oh, I know the downsides to C++. I really do. I programmed in the language for a half-decade (back in the "bad ol' days" of pre-C++08, waaay back in the late 90s), so I'm well aware of all the dangers of pointers and what-not. But Rust isn't a high-level language, and there's way too much C and C++ code out there for companies to credibly consider rewriting all that code. We can moan and groan about computer security and stability all we like, but doing that rewrite is not a simple line-for-line replacement, precisely because of the pointer guarantees that Rust enforces--all of those pointers have to be debated, decided, and declared, and it has to be consistent across multi-million-line codebases. Rust may make some inroads in getting new system-level development, but it's not going to make C++ "go away" any time soon, not this year or the next (or the one after that, or even the one after that).

Odin, Nim and Zig will gain some attention (p 0.5)

All of these are system-level languages in the tradition of C++ or Rust, and all of them are interesting in their own right. I suspect some of the enthusiasm for Rust will get siphoned off into these, and before long the "Rust will replace C++" crowd will turn into the "Rust/Odin/Nim/Zig will replace C++" crowd, before they start fighting among themselves over which of those four will do the replacing (because, of course, there can only be one, right?).

DIY databases will start gathering attention (p 0.5)

There's been a quiet surge of interest in SQLite recently, since it ships pretty much everywhere (it's already on every mobile device on the planet, and its C codebase and bindings make it super-easy to FFI out to if there aren't already language-level bindings for your chosen language) and it supports a pretty complete subset of SQL-92. That's gotten a number of folks intrigued with the idea of "local database"-based functionality, which in turn has sparked some interest in "Wait, what exactly makes up a database, again...?"--because if I can get at the storage engine, and bypass the SQL-ish query layer and talk directly to the query engine, and.... Well, suddenly people are starting to wonder if the database has to be the black box we've always assumed it to be. I think in 2025 we're going to start cracking it apart and doing a little plug-and-play of the components as part of, or behind, their service interfaces.
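
(If "local database" sounds abstract, here's a minimal sketch of the appeal: no server process, no connection string, no credentials--just a file next to the app. The filename and schema below are made up for illustration.)

```python
import sqlite3

# The "database" is just a file sitting next to the app, which is exactly why
# local-first designs keep reaching for SQLite.
conn = sqlite3.connect("app_local.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS notes (
        id   INTEGER PRIMARY KEY,
        body TEXT NOT NULL,
        ts   TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("works offline, sync later",))
conn.commit()

for row in conn.execute("SELECT id, body, ts FROM notes ORDER BY id"):
    print(row)
conn.close()
```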

Our book will ship! (p 0.9)

Seriously! It will! I promise!

Maybe I'll have a full-time gig again? (p 0.5)

Three years and counting. I wouldn't mind the "enforced retirement" so much, were it not for all these people coming around my place, telling me I owe them money.... Fortunately I keep finding the odd thing to do every so often, but it'd be nice to not have to answer questions that start with "Tell me of a time when....".

I'll start either a YouTube channel or some video training (p 0.4)

The probability goes up if I don't have something full-time landed by the end of January.

I'll speak at a brand-new (to me) conference (p 0.5)

I have no idea which one, but I'd love for some suggestions and/or requests. I love the shows I hit regularly, but I wouldn't mind going someplace entirely new and talking to people who've never heard me speak before.

As always, feel free to comment and tell me why I'm crazy, in whatever forums feel comfortable. In the meantime, talk to you all next year... if not sooner.

