01 January 2024

It's that time of the year again, when I make predictions for the upcoming year. As has become my tradition for nigh-on a decade now, I will first go back over last year's predictions, to see how well I called it (and to keep myself honest), then wax prophetic on what I think the new year has to offer us.

As per previous years, I'm giving myself either a +1 or a -1 based on purely subjective and highly biased criteria as to whether it actually happened (or in some cases at least started to happen before 31 Dec 2023 ended). If you want to skip to the new predictions, scroll down to the next major heading.

2023 was a tumultuous year, for sure--lots of tech layoffs in the early part of the year, as big player after big player announced that whole trunks of the org tree no longer had a job. It created some real chaos and consternation across our industry, and that was all before the collapse of Silicon Valley Bank, which sent a huge ripple through startups and established players alike.

By the way, as of this writing, despite multiple false starts, I'm still looking for my next great adventure (ideally as a Developer Relations or Engineering leader-of-leaders), so if you find my analysis here interesting or intriguing--even if you disagree with it--perhaps there's a role in which I can do this kind of strategic and executive thinking on your company's behalf? Would love to hear from you.

In 2023...

... I wrote a lot of stuff. (That always seems to happen when I do these.) So let's get to it; as I do each year, I'll include the full text of what I wrote in each bullet point first, then put the Result after it with whatever insights or comments seem relevant. (Arguably none of them are, but hey, it's my set of predictions, so....) As always, evidence is highly opinionated, mostly anecdotal, and entirely from my (and some from my professional network's) perspective, so caveat emptor.

To start, I wrote:

First off, the pandemic hasn't left us, and CDC is starting to talk about masking up again. Despite desperate efforts to pretend we're clear of COVID (and related viruses), we're not, and it's going to continue to rear its head in ugly ways going forward. I don't think we're going back to mandatory stay-at-home policies, mind you, but it's going to be a couple of years of really bad winters, medically speaking. The "taper down" I talked about last year definitely seems to be in full swing (or slope, as the case may be).

Masking is still a thing in much of the Pacific Northwest, and masks are still required in most medical offices, but beyond that, COVID or no COVID, people have made their choice. Aside from the occasional performative video piece, nobody is going around ripping masks off of other people's faces. Periodically, somebody catches COVID and is down for a week (minimum, usually two), but that seems to be a risk we're willing to accept as a nation.

Secondly, even though the Ukraine conflict continues, and inflation reared its ugly head during the end of 2022, the economy doesn't look as bad as we thought it would six months ago, and I think things are going to "perk up" once we get through the holidays. 2022 sucked, but there's a lot of reasons to think 2023 won't be nearly as bad. (Of course, we thought that about 2022 compared to 2021, and about 2021 compared to 2020, so....)

Statistically, the economy has recovered. Inflation is coming back down, unemployment is actually back to reasonable numbers, and the stock market is doing... better. Nothing is back to its pre-2019 levels, but frankly, we'll probably never be there.

Meanwhile, the war in Ukraine continues, and a new one has opened in Gaza. Regardless of which side of the political spectrum you're on, everything feels terrible, and the last time that happened, Jimmy Carter was President. Are we due for another "Morning in America" moment? Probably not--we're all too cynical about the media for Reagan's watershed messaging to work again, but something is going to give soon, and we will all rush to embrace it, because right now nothing feels as good as it should.

Meanwhile, on the tech front....

In summary:

... leaving me with what I consider to be a completely missed round. Yikes. (I am really annoyed at how badly I misread the room this time last year on the job market--for personal reasons, yes, but also because I really do pride myself on my ability to make these sorts of professional forecasts, and that was a huge misread.)

2024 Predictions

With that settled up, let's take a look at what I think will happen across calendar year 2024. As I've done in past years, I'm including a probability score with each prediction, in much the same way intelligence officials do in their assessment reports, to give you a sense of how confident I am in each one.

Hiring will open up. (p 0.8)

The interesting thing about the layoffs is that while they continued to trickle out from various companies all the way into December of 2023, it was a trickle compared to the flood that happened in Q1, particularly in January. More interestingly, the kinds of people being hired will shift slightly--right now (end of 2023), companies are focused on hiring ICs to fill existing holes in their current org tree. But with the new year, and a bit of optimism in the market, they'll start growing new branches on that tree, and that will require more in the way of engineering management and "ancillary units" like developer relations, R&D, and so on. It'll take a while, and it'll never be like 2019, but before too long things will get stronger.

WebAssembly will gain more traction. (p 0.5)

I haven't really paid much attention to WebAssembly for the past few years, because I wasn't sure where things were going. The 1.0 specification is done, but it's pretty minimal compared to other, richer, more full-featured virtual instruction sets (like the JVM or the CLR). Truthfully, it's a ridiculously bare-bones spec. But even as I say this, there's some interesting infrastructure targeting WASM more and more (most notably LLVM and GraalVM), making it easier and easier for WASM to gently "slide in" to a deeper role within a dev toolchain and do some interesting things.

That said, WASM has yet to really demonstrate where its real value lies--why compile something to a WASM binary, which generally means targeting the browser, when there are so many JavaScript-centric options already? Where's the compelling value-add from the developer perspective? This is the hinge on which WASM's broad-scale applicability turns: if WASM can show how using language-X-compiled-to-WASM enables a new kind of feature or a faster/easier workflow to get to "done", then WASM could very well begin to emerge as the infrastructure backplane for browser-based applications all over the Internet. I don't know what that compelling reason is yet, though, which is why I leave this at p 0.5.
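
To make the "language-X-compiled-to-WASM" scenario a little more concrete, here's a minimal sketch using Kotlin's (still-experimental, as of this writing) Kotlin/Wasm target; the function itself is purely illustrative:

```kotlin
// A compute-heavy function compiled to a WASM binary and exported to the
// browser's JavaScript, assuming Kotlin's experimental wasmJs target.
@OptIn(ExperimentalJsExport::class)
@JsExport
fun mandelbrotEscapeCount(cr: Double, ci: Double, maxIterations: Int): Int {
    // A tight numeric loop: the kind of work where a WASM binary can
    // plausibly outrun interpreted (or even JITted) JavaScript.
    var zr = 0.0
    var zi = 0.0
    var n = 0
    while (zr * zr + zi * zi <= 4.0 && n < maxIterations) {
        val t = zr * zr - zi * zi + cr
        zi = 2.0 * zr * zi + ci
        zr = t
        n++
    }
    return n
}
```

From the browser's side, that export looks like just another module to import and call; the open question is whether that buys the everyday CRUD web app anything the existing JavaScript ecosystem doesn't.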

Generative AI will lose its luster. (p 0.7)

It already has started, in many ways--memes are now common across the Internet showing ChatGPT conversation windows in which it responds to the question, "How do I use GitHub?" with a completely erroneous, mistake-riddled answer along the lines of "In order to push code to the remote server, issue the command git branch...". The stories of the lawyers who used ChatGPT to write legal briefs that turned out to be filled with inaccuracies and fabricated citations--for which they were sanctioned, by the way--have cast some serious doubt on the ability of these generative AIs to "fire all the peons" whose work ChatGPT and its ilk were supposedly going to replace.

We will begin to disambiguate between generative AI and large language models (p 0.5)

It happens with AI every decade or so--something new comes along, it takes our imagination by storm, and we start chanting "AI! AI! AI!" like an amped-up-on-booze-and-caffeine-pills crowd at a wrestling match. Then, as we get disenchanted with the results, the various AI firms--who have yoked their success to the success of their AI work--will start to point out that they were never "AI" companies; they were "large language model" companies, and therefore never part of the hype machine that's now augering in.

Custom AI models will begin to gain traction. (p 0.6)

Making use of a large company's model for your natural-language analysis needs is great, but over time, companies are going to figure out that the cost of that model will probably be smaller if it can be hosted locally, rather than constantly going to the cloud and back again. (See the economies-of-scale discussion earlier, as well as the performance and reliability implications, a la "The Eight Fallacies of Distributed Computing".) Slowly, quietly, companies with some room in their budgets are going to start looking to develop their own models, and in many cases may find that the models they're building don't need to be quite so large (and heavy)--that in fact a "medium" language model, or even a "small" one, would work, allowing for a local presence on the same node as the rest of the processing. OpenAI and other firms are going to combat this by constantly releasing new models with new excitement and fanfare, but like the cloud itself, the basic path here will be "start with the hosted LLM, then as your success and budget grow, look to build--and tune--your own". It'll be "Buy vs Build" decision-making all over again.
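
As a sketch of what that pivot looks like from the application's side--everything here is hypothetical, from the URLs to the JSON shape, and deliberately not any particular vendor's API--the switch is often little more than an endpoint away:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Hypothetical endpoints: a big hosted LLM vs. a small model served on the
// same node. Neither URL nor the request body reflects a real vendor's API.
const val HOSTED_URL = "https://api.big-llm-vendor.example/v1/completions"
const val LOCAL_URL = "http://localhost:8080/v1/completions"

fun complete(prompt: String, useLocalModel: Boolean): String {
    val endpoint = if (useLocalModel) LOCAL_URL else HOSTED_URL
    val body = """{"prompt": "$prompt", "max_tokens": 128}"""
    val request = HttpRequest.newBuilder(URI.create(endpoint))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
    // The calling code is identical either way--the "build" decision is
    // about where the model lives, how big it is, and who pays the
    // latency and egress costs.
    val client = HttpClient.newHttpClient()
    return client.send(request, HttpResponse.BodyHandlers.ofString()).body()
}
```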

New and interesting languages will begin to make waves. (p 0.6)

I've been saying this for years, but I keep holding out hope. Frankly, I think some of the development will be to raise the abstraction levels of what we do, because ChatGPT's success at writing code is due to the fact that 95 times out of 100, we're writing the same basic CRUD crap over and over and over again. The Pareto Principle holds here: 80% of all mobile applications (as an example) are doing the same stuff; it's just a different 80% for each one. A language/stack that can embrace an onion-style architectural approach (allowing for "trap doors" to drop down a level of abstraction when/if necessary, much as C/C++ allowed for inline assembly way back in the day) will be the answer here.
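
To illustrate the "trap door" idea with entirely hypothetical names, here's a Kotlin-flavored sketch: a tiny query DSL that covers the common cases, plus an escape hatch for when the abstraction runs out:

```kotlin
// A toy query DSL: the high-level API covers the common 80%, and raw()
// is the "trap door" down to hand-written SQL, much as inline assembly
// once punched through C/C++'s abstraction. All names are hypothetical.
class QueryBuilder {
    private val clauses = mutableListOf<String>()
    fun where(field: String, value: String) { clauses += "$field = '$value'" }
    fun raw(sql: String) { clauses += sql }  // escape hatch, one level down
    fun build(table: String): String {
        val where = if (clauses.isEmpty()) ""
                    else " WHERE " + clauses.joinToString(" AND ")
        return "SELECT * FROM $table$where"
    }
}

fun query(table: String, block: QueryBuilder.() -> Unit): String =
    QueryBuilder().apply(block).build(table)

fun main() {
    val q = query("orders") {
        where("status", "open")                        // the 80% case
        raw("created_at > now() - interval '7 days'")  // the trap door
    }
    println(q)
}
```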

The other element that could be interesting to explore is the intersection of natural language models and compiler front ends--if we can get a language to start looking more like natural language, it might enable some interesting DSL-type scenarios for end users, and reduce the demands/pressures on developers to respond to every little change.

Cracks in the "full-stack developer" facade will grow. (p 0.7)

The demand for the "full-stack developer" is mostly a cop-out on the part of managers, who don't want to artificially restrict their employee search because "hiring is hard". So, rather than carefully considering which skills the current team lacks and needs shoring up, they scribble "full-stack developer" on the job description and proceed to rattle off every single technology the team has ever used.

AR/VR will start showing glimmers of life (p 0.5)

If AR/VR is going to achieve anything of any substantial value, it really needs to do so soon, before it is relegated to the "Interesting Things That Were Out Of Their Time" bin, like the Apple Newton and the work out of Xerox PARC. This year might--might--be the year it presents us with something, particularly given the rise in interest in wearables. Certainly, with Meta going all-in on the whole "Metaverse" thing (whoever came up with that should be fired), there's going to be no shortage of money thrown at the problem; the big question is, will there be any actual innovation around UI? Because so long as the AR/VR thing is solely game-centric, it's fairly safe to assume that AR/VR will go right into that bin without too much hesitation. I know, games are a huge industry (measured in the hundreds of billions now), but that's not enough to support hardware; witness the difficulties conventional gaming hardware has had over the decades (joysticks, for example: today's flight sims are vastly more sophisticated than the ones of twenty years ago, yet there's nary a joystick to be found). If AR/VR (which requires hardware specific to it) is going to reach a point that justifies buying that hardware, it has to be for something more than just gaming (and please don't say "education", because if there's one place in the world that has almost no budget, it's education).

"Car apps" will begin to become "a thing" (p 0.6)

More and more vehicles out there are coming with computers in the console, offering easy connection to your mobile device. What's been lacking is any sort of app ecosystem beyond the basics (maps, audio entertainment, etc.). Now that there's more of a target market, perhaps we are reaching a critical mass of potential customers to justify the investment in building apps specifically for vehicles.

If this does start to take hold, it will be interesting to see what sorts of apps get built (I could imagine some CRM apps for certain kinds of salespeople, for example) and what form user interaction takes (voice control and interaction would be very important for drivers, for example). Frankly, though, the hard part will be the core innovation itself--what sort of apps do we want when we're driving?

Kotlin will remain an "Android-only" language (p 0.5)

This one is a hard one to call, but I'm flipping a coin and landing on the pessimistic side of the "Will Kotlin break out of Android-only?" question. For those who weren't following the Kotlin story at home, the Kotlin team has recently done a couple of things to make it more than just "that language you use for Android": Kotlin Multiplatform and Kotlin/Native. These open up the language for use in more situations, but their success will really hinge on how many developers actually want to use Kotlin in more places than just their Android codebase.

Historically, multiplatform languages have not done well, with the sole (arguable) success being that of Java--whose "Write once, run anywhere" campaign didn't really accomplish much. (Remember, most of Java's success was in the server room, not the desktop.) Native might have more possibility for success, but either one gaining any traction would be an interesting development and potentially grow Kotlin beyond just "an Android thing".
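
For those who haven't seen it, the mechanism Kotlin Multiplatform leans on is the expect/actual split: common code declares the contract it needs, and each platform supplies its own implementation. A minimal sketch (the declarations here are illustrative, shown as they'd be spread across source sets):

```kotlin
// commonMain: shared code declares what it needs...
expect fun platformName(): String

fun greeting(): String = "Hello from ${platformName()}!"

// androidMain (or jvmMain): ...and each platform supplies it.
actual fun platformName(): String = "Android"

// iosMain, compiled via Kotlin/Native:
actual fun platformName(): String = "iOS"
```

Whether enough developers want to share a `greeting()`-style core across Android, iOS, server, and desktop to sustain the ecosystem is exactly the coin I'm flipping here.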

C# and Java will continue to "pile on" with features (p 0.8)

Both languages have reached a point where the weight of the language is really beyond the ability of any one person to keep track of what's there, yet each time a new release comes out, it comes with a whole slew of new features, each bringing new syntax to an already-existing part of the language, or cramming yet another interesting idea into an already overloaded collection of features. (Can you really look at either C# or Java and tell me they're "object"-oriented anymore? By this point, you can go quite a ways with either and never see or hear an object anywhere along the way.)

Look, maybe this is my "Gitoffamahlawn" moment in these predictions, but both of these languages are old enough to drink, they each spun up powerful and resilient platforms that now host a number of newer, simpler, more concise languages, and at a certain point it's reasonable to say, "This one is done. There's nothing new we need to add." To keep cramming new stuff in is a terrible addiction--it's a way to show that "We're still hip! We're still doing cool stuff (...and therefore justify why companies should still be paying us license fees!)" We keep telling startups not to do this, but our two major providers of software development tools can't seem to learn the lesson themselves.

Spend some time with F#. With Clojure. With Kotlin. Play with some of the new ideas (Mint, Wasp, Io, Alloy). Or even go back and experiment with some of the classics (Smalltalk, Self, LISP, Scheme). But, for your own sake, stop enabling the relentless pounding of the feature surf on the beach of your brain by breathlessly hanging on "What comes next", because in time, you'll be battered into sand.

Crypto falls even further. Blockchain struggles to reinvent. (p 0.7)

Look, faith in cryptocurrency is at an all-time low, except among the people caught holding the bottom of the Ponzi pyramid, who desperately want to hype the hell out of it so they can pass the bag to the next level down. That won't change. Blockchain, the world's worst-designed distributed database, will continue to wrestle with the goal of finding a reason to exist, and hey, it could maybe even find one bullseye, given enough darts. (Many companies that invested in blockchain have either decided to walk away from the dartboard, or decided to load up on a lot of darts--and maybe a shotgun or two--until they hit that bullseye.)

I really would love for these two to just go away entirely, but they won't, because nothing ever dies completely. But this is the last year they're going to get much--if any--serious time in the tech press, despite the fevered efforts of a precious few on LinkedIn. (Yes, Andreessen Horowitz is going big into it; yes, they've been big into it since its start; yes, there will always be people who see what they want to see, instead of following actual evidence; and yes, folks, there's still time to dive in and turn your profit, just don't be the last one holding the pyramid up!)

And, for me, 2024 will be the last year I talk about this in any form.

Phishing attacks go ballistic (p 0.7)

Thanks, AI! With all these generative natural language tools (like ChatGPT), it will be easier than ever to generate emails intended to trick users into surrendering their credentials to fake sites. That will lead to more breaches, more private info leaks, and more of all the wonderful things that come with those.

What will make it even worse will be all the security companies rolling out "AI-based" security tools, claiming that the AI will somehow be able to better protect against those very same attacks--but while generative AI can create really good-looking human-facsimile output, it's not always great at recognizing artificially-created human-facsimile input. Which means the attackers just got a boost to their toolset, and the defenders will be scrambling to keep up this year.

Databases will incrementally improve (p 0.7)

... as opposed to releasing something drastic. There are some interesting ideas percolating over in the world of DBOS, spearheaded by one of the principals of the Postgres world, but I have a feeling it'll take a little while for those ideas to brew before there's really something there to evaluate. (Hope I'm wrong, though.)

Meanwhile, though, players in the RDBMS world will slip a new feature into the mix by polluting the relational model just that wee bit more, the NoSQL players will pick just a tiny bit more from the relational world and add that, and developers will continue to choose which player they want to play with based on criteria that touches maybe 1% of the feature set of the player they choose.
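
As one small example of that blurring (sketched in Kotlin over plain JDBC; the `->>` and `@>` operators are Postgres' real JSON operators, but the connection string, credentials, and schema here are hypothetical, and you'd need the Postgres JDBC driver on the classpath):

```kotlin
import java.sql.DriverManager

// A "relational" table with a schemaless JSONB column: half tables,
// half documents, and the line between RDBMS and NoSQL blurs a bit more.
fun main() {
    val url = "jdbc:postgresql://localhost/shop"  // hypothetical database
    DriverManager.getConnection(url, "app", "secret").use { conn ->
        conn.prepareStatement(
            // sku is a plain column; attrs is free-form JSON queried in SQL.
            "SELECT sku, attrs ->> 'color' AS color FROM products " +
            "WHERE attrs @> ?::jsonb"
        ).use { stmt ->
            stmt.setString(1, """{"in_stock": true}""")
            val rs = stmt.executeQuery()
            while (rs.next()) {
                println("${rs.getString("sku")}: ${rs.getString("color")}")
            }
        }
    }
}
```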

I'm publishing a book (p 0.8)

A few other folks and I are working on a book based on the Developer Relations Activity Patterns catalog that I drafted a ways back, expanding on the material there in a big way. That'll come out in 2024, almost guaranteed.

Fifteen predictions for the next twelve months. Let's see how I do this time.

Talk to you all next year... if not sooner.


Tags: predictions  

Last modified 01 January 2024