01 January 2024
It's that time of the year again, when I make predictions for the upcoming year. As has become my tradition now for nigh-on a decade, I will first go back over last year's predictions, to see how well I called it (and to keep myself honest), then wax prophetic on what I think the new year has to offer us.
As per previous years, I'm giving myself either a +1 or a -1 based on purely subjective and highly-biased evaluation criteria as to whether it actually happened (or in some cases at least started to happen before 31 Dec 2023 ended). If you want to skip to the new predictions, scroll down to the next major heading.
2023 was a tumultuous year, for sure--lots of tech layoffs in the early part of the year as big player after big player announced that whole trunks of the org tree no longer had a job. It created real chaos and consternation across our industry, and that was all before the collapse of Silicon Valley Bank, which sent a huge ripple through startups and established players alike.
By the way, as of this writing, despite multiple false starts, I'm still looking for my next great adventure (ideally as a Developer Relations or Engineering leader-of-leaders), so if you find my analysis here interesting or intriguing--even if you disagree with it--perhaps there's a role in which I can do this kind of strategic and executive thinking on your company's behalf? Would love to hear from you.
... I wrote a lot of stuff. (That always seems to happen when I do these.) So let's get to it; as I do each year, I'll include the full text of what I wrote in each bullet point first, then put the Result after it with whatever insights or comments seem relevant. (Arguably none of them are, but hey, it's my set of predictions, so....) As always, evidence is highly opinionated, mostly anecdotal, and entirely from my (and some from my professional network's) perspective, so caveat emptor.
To start, I wrote:
First off, the pandemic hasn't left us, and CDC is starting to talk about masking up again. Despite desperate efforts to pretend we're clear of COVID (and related viruses), we're not, and it's going to continue to rear its head in ugly ways going forward. I don't think we're going back to mandatory stay-at-home policies, mind you, but it's going to be a couple of years of really bad winters, medically speaking. The "taper down" I talked about last year definitely seems to be in full swing (or slope, as the case may be).
Masking is still a thing in much of the Pacific Northwest, and masks are still required in most medical offices, but beyond that, COVID or no COVID, people have made their choice. Aside from the occasional performative video, nobody is going around ripping masks off of other people's faces. Periodically, somebody catches COVID and is down for a week (minimum, usually two), but that seems to be a risk we're willing to accept as a nation.
Secondly, even though the Ukraine conflict continues, and inflation reared its ugly head during the end of 2022, the economy doesn't look as bad as we thought it would six months ago, and I think things are going to "perk up" once we get through the holidays. 2022 sucked, but there's a lot of reasons to think 2023 won't be nearly as bad. (Of course, we thought that about 2022 compared to 2021, and about 2021 compared to 2020, so....)
Statistically, the economy has recovered. Inflation is coming back down, unemployment is actually back to reasonable numbers, and the stock market is doing... better. Nothing is back to its pre-2019 levels, but frankly, we'll probably never be there.
Meanwhile, the war in Ukraine continues, and a new one has opened in Gaza. Regardless of which side of the political spectrum you're on, everything feels terrible, and the last time that happened, Jimmy Carter was President. Are we due for another "Morning in America" moment? Probably not--we're all too cynical about the media for Reagan's watershed speech to work again, but something is going to give soon, and we will all rush to embrace it, because it just doesn't feel as good as it should right now.
Meanwhile, on the tech front....
"Blockchain-related losses are going to be huge tax write-offs. It's already starting, and other companies will follow suit as soon as they see others doing it." Going to give myself a +1 on this; most big-tech firms have divested their blockchain development efforts (and advertising), and the only ones really left pursuing it are startups (most of which seem desperate to exit from what I can see). Blockchain is pretty much dead as a tech, it seems, unless you're so deeply wrapped up in it that you need it in order to survive. Tally: +1.
"***Blockchain-related businesses are likely to be the target of lawsuits.***" Well.... Sam Bankman-Fried comes to mind immediately, not to mention most of his inner circle. That said, there were fewer lawsuits than I expected, particularly against the media personalities that shilled for crypto (Ashton Kutcher, Matt Damon, etc), but I probably shouldn't have expected them in the first place--a friend pointed out that the famous almost never get sued for hawking a bad product. Still, lawsuits and jail time, I'll take the (debatable) +1 here. Tally: +2
"The fallout from the cryptocurrency nightmare is going to stain 'Web3' for a good long time. .... Web3 is not going to be a fun place for a while." That definitely qualifies as a +1. Every one I know, be they developer, architect, tech manager, you name it, they're all staying well clear of anything "Web3" or "blockchain", even when the job market was at its worst. Reddit is filled with threads of curious developers asking about the "negative feelings about Web3" and most future-looking Web3 discussions seem to tie it closely to social media, which is also taking major hits right now in the courts of public opinion. Tally: +3
"***Distributed systems start to embrace contract-driven development cycles.***" This one is harder to judge--OpenAPI hasn't reached quite the level of ubiquity that I would expect to call this a solid +1, but at the same time, there are a lot of groups/teams that are starting to look for better ways to interact with an HTTP-based API than the traditional "just build out the URL and HTTP header by hand...." approach. A lot of the contract seems to be getting buried inside official SDKs released by the API provider, which more or less provides the same kind of "contract" approach, albeit with a lot more work on the part of the API provider. Still feels like we're moving in the same direction on the Ferris wheel, though, so... +1. Tally: +4
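To make the contrast concrete, here's a rough Kotlin sketch of the two styles: the hand-built HTTP call, versus the kind of typed client an OpenAPI generator (or an official SDK) might emit. The endpoint, `OrdersClient`, and `Order` below are all hypothetical, purely for illustration:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Style 1: "just build out the URL and HTTP header by hand"--the contract
// lives only in the developer's head (and the API docs, if they're lucky).
fun fetchOrderByHand(id: String): String {
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.example.com/v1/orders/$id"))
        .header("Accept", "application/json")
        .GET()
        .build()
    return HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
        .body()
}

// Style 2: a contract-shaped client. A generator (or a vendor SDK) emits the
// types and the interface, so the compiler enforces the agreement for you.
data class Order(val id: String, val total: Double)

interface OrdersClient {
    fun getOrder(id: String): Order
}
```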
"***Microservices start to give way back to deliberate monoliths.***" Well, talk about interesting--not more than a week after I wrote last year's predictions, I wrote another piece, "You Want Modules, Not Microservices", and boy did that resonate--it got picked up by a bunch of news and journal services, and much of the feedback/commentary was... actually agreement (modulo the odd conspiracy theorist who is still convinced that monoliths are somehow the gateway drug to 5G nanobots in your blood controlling your mind at the every whim of Bill Gates and the Greater Dolphin Cabal of the North Atlantic). Several companies turned around and de-micro'ized their architecture back into a monolith, in fact, including Amazon. Turns out, all that traffic across the network bouncing between a bunch of services is bad! Who could've guessed? +1, bringing my Tally: +5
"***Security companies are going to need to OSS their offerings.***" Checking the Auth0 GitHub repositories.... Nope. It was a long shot, and it definitely didn't happen. -1 means I'm now at Tally: +4
"Work-From-Home (WFH) continues to normalize and become "just another location". (Probability 0.7)" Well... this one's harder. For each "It's Time to End the War on Remote Work", there's a "Remote Work is Still 'Frustrating and Disorienting' for Bosses". Studies continue, and likely will for the remainder of this decade, and while many companies are quietly just sorting it out internally, other companies are making great noise about return to the office (or just "work more hours", which seems to have had exacty the effect you'd expect), but when they do, it generates a ton of bad PR and the executives are usually obligated to walk it back. It looks like 2024 is going to (hint, hint) see a drawdown in the attempt to get people back into the office--but we'll leave that for the next section. For now, I'll give myself a +1 simply because it does feel like it's becoming more and more normal to expect a remote gig, and that's really the definition of the word "normalize". Tally: +5
"***Somebody is going to sue over state income tax.***" You can see where I was coming from on this one, but the SCOTUS' continued unwillingness to take a stand on it surprised me and continues to surprise me. Technically, I get a point for this one, since... well, tax law professor, New York, you get the picture, but this has been an ongoing fight since well before the pandemic, and it doesn't seem to have "ballooned" into the bigger fight that I thought it would be, so at best I feel it's a 0. Tally: +5
"***Low-code/no-code is going to accelerate further.***" A lot of the low-code/no-code efforts got shoved off to the side of the mainstream tech media in 2023 because of the incredible tech hype and media storm around generative AI and natural language models, so any efforts at advancement here kinda got derailed. However, the generative AI hype wave caught up a few low-code/no-code entrants along the way, and now you can build workable prototypes from napkin sketches, apparently. (No word on what happens if these AI-powered prototypes have access to the Internet, though.)
By the way, those Boston Dynamics robots are getting closer and closer to mounting guns, regardless of what the manufacturer says, enough so that states are considering outlawing them entirely. Because nothing stops a T-800 like a cease-and-desist order. (I kid, I kid. Actually, no, it's more "I cry, I cry", because it's going to happen whether BD wants it to or not.)
+1 means Tally: +6
"***Internal tech audits will become relatively popular.***" If it happened, everybody kept reallllly quiet about it, which you would kinda expect. But if it didn't show up in the tech press, did it really happen? -1 means Tally: +5
"***Muskian-style management will be studied, cargo-culted, and lead to the failure of more than one company.***" groan. sigh. Lord save me. +1 I hate that I was right about this one. Tally: +6
"***2023 will be the year we try to figure out what to do with all this AI stuff.***" Nope. Nope, nope, nope, nope, nope. -1 We haven't even figured out the difference between natural language models and generative AI, or the copyright implications of training a model on copywrit sources, or... pretty much anything beyond "Slap some AI on it, and it will sell!". We haven't figured anything out here, and it was horrendously naive of me to think we could in just a years' time. Tally: +5
"BONUS PREDICTION: 2024 will be the year that we start putting AI into some of our apps--like using ChatGPT for chatbots--and 2025 will be the year that we start ripping it all back out because we completely got it wrong. (We always do this. And I see no signs that we'll stop in my lifetime.)" OK, that one, I got right. But it was like candy from a baby--no points.
"***Companies will accelerate their drive to become 'tech companies'.***" A fair number of firms continue their "technology transformation" efforts, but nobody's really any faster or slower--or better--at it than they were before. -1 This was probably complicated by....
"***Hiring will accelerate at the end of 1Q/2023.***" I kinda think I should reset all my points back to 0, because holy crap, was I wrong about this. So deeply, incredibly, soul-wrenchingly wrong. A year ago, I actually wrote this: "Even as companies are engaging in layoffs, these aren't the savage cuts that we saw back in the Dot-Bomb era (circa 2001) or the Great Recession (circa 2008) era." Maybe back in 2022 that was a reasonable take, but then 1Q/2023 came around, and all that went out the window as big firm after big firm just pruned at their org tree, and did it savagely. This layoff tracker says it perfectly: "So far in 2023, there have been 1,992 layoffs at tech companies with 428,335 people impacted (1,183 people per day)." The graph shows 108,000+ in January 2023 alone. Tally: 0 So, so wrong.
"***Entry-level positions are going to be easier to come by.***" Ironically, this one actually seems to be taking shape. Even as a lot of companies laid off in 2023, many of them started hiring again almost immediately (2Q, 3Q), but almost exclusively for individual contributors (ICs) and many of those roles were entry-level positions. It still isn't a great job market by comparisons against the average from the 2010s, and definitely not by comparison against the pandemic era, but it's not hopeless if you're a tech IC right now. (Management, on the other hand.... that's where a lot of the struggle still lies.) I'll take the +1 since I'm desperate for points now, bringing us to Tally: +1
"***Platform-oriented development is going to begin making some waves.***" This one is a little tricky to judge. On the one hand, we can see companies developing platforms all over the place, particularly as exemplified by... well, pretty much every startup that's gotten a round of funding since 2019. But my prediction is a little vague, and speaking honestly, I was expecting to see a bit more of a push from multiple places in the developer ecosystem, which didn't happen. So... -1 takes us to Tally: 0
"***Cloud will begin to shift.***" Now, this one was interesting. When a major talking-head like DHH talks about moving off the cloud, it sends waves, and sure enough, companies began to do the math and realize that cloud companies are charging them at rates that generate (surprise!) a profit... to the cloud company. +1 See, that was always going to be the endgame here, because while the economy-of-scale graph does mean that you can generate more of a good/service per unit if you do it at larger amounts, that graph does eventually flatten out, and more importantly, multiple larger businesses began to realize that they were probably large enough to be able to take advantage of that same graph, themselves--once they'd put the initial investment in, they could, in essence, run their own private cloud, and pay even less for it than going through Azure, AWS, GCP, or any of the others. It will always be cheaper to make it yourself, once you've put the investment in to be able to make it yourself at scale. It's the same reason McDonald's makes its own buns, beef, and vegetables--it's the technology equivalent of controlling your supply chain. Tally: +1
"Elon will sell Twitter (perhaps involuntarily) at the end of 2023, to another tech firm (like Microsoft or Oracle). ... And that's how the grand MuskTwitter experiment will end. Not with a bang, but with a pathetic whimper."*** Well, it's the end of 2023, and... checking... nope, Musk appears to have no intention of selling off Twitter-I-mean-X-sorry. But the quality of the Twitter timeline keeps dropping, my friends and follows (the people whose opinions and insights I respect enough to follow) are slowly dropping off as well, and I'm about one more unsolicited DM from Nikki Haley away from doing so myself. So, first, I take a -1 on the prediction, but in the meantime, Musk continues to drain away everything that was actually good about Twitter. Tally: 0
"Meanwhile, alternatives to Twitter will continue to sprout up, but none will actually be able to be everything Twitter was, partly because Twitter was (at the time) unique in its approach to social media, and as a result it created a "perfect storm" that led the platform to its current (well, pre-Muskian) level of success. That will be nearly impossible to recreate." Yeah... +1. Sorry, Mastodon, Threads, BlueSky, and every other Twitter clone that's been released in the last year, but none of you are showing me the critical mass of "interesting mess" that Twitter became. It used to be that people I followed were also commenting and RT'ing on people they followed, and made it easier for me to find new and interesting voices to follow, and the clones all just don't seem to "get" that. I suspect, going forward, that before long my only social media presence will be LinkedIn and this blog, which will make me feel a little sad and alone, but frankly I'll probably feel a lot less depressed about the world around me once I do. Tally: -1
"RSS will make a comeback." Nope. -1 It hasn't, and it probably never will by this point. Sorry, folks, but RSS is essentially dead as a protocol of widespread adoption beyond where it's already in use now. Tally: 0
... leaving me with what I consider to be a completely missed round. Yikes. (I am really annoyed at how badly I misread the room this time last year on the job market--for personal reasons, yes, but also because I really do pride myself on my ability to do these sorts of professional forecasts, and that was a huge misread.)
With that settled up, let's take a look at what I think will happen across calendar year 2024. As I've done with my predictions, I'm including a probability score with each one, in much the same way intelligence officials do in their assessment reports, to give you a sense of how confident I am in the prediction.
The interesting thing about the layoffs is that while they continue to trickle out from various companies all the way to December of 2023, it's a trickle compared to the flood that happened in 1Q, particularly in January. More interestingly, the kinds of people that are going to be hired will shift slightly--right now (end of 2023), companies are focusing on hiring ICs to fit into existing holes in their current org tree. But with the new year, and a bit of optimism in the market, they'll start growing new branches to that tree, and that will require more in the way of engineering management and "ancillary units" like developer relations, R&D, and so on. It'll take a while, and it'll never be like 2019, but before too long things will get stronger.
I haven't really paid much attention to WebAssembly for the past few years, because I wasn't sure where things were going. The 1.0 specification is done, but it's pretty minimal compared to other richer, more full-featured virtual instruction sets (like the JVM or the CLR). Truthfully, it's a ridiculously bare-bones spec. But, even as I say this, there's some interesting infrastructure targeting WASM more and more (most notably LLVM and GraalVM), making it easier and easier for WASM to gently "slide in" to a deeper role within a dev toolchain and do some interesting things. That said, though, WASM still has yet to really demonstrate where its real value lies--why write something to WASM binary, which generally means targeting the browser, when there are so many JavaScript-centric options already? Where's the compelling value-add from the developer perspective? This is the hinge on which WASM's broad-scale applicability turns: If WASM can show how using language-X-compiled-to-WASM enables a new kind of feature or a faster/easier workflow to get to "done", then WASM could very well begin to emerge as the infrastructure backplane for browser-based applications all over the Internet. I don't know what that compelling reason is yet, though, which is why I leave this at a probability of 0.5.
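For a taste of what that "slide in" could look like, here's a minimal sketch using Kotlin's wasmJs target (experimental as of this writing)--with the caveat that I've omitted the Gradle build wiring entirely, and that a toy function like this is precisely *not* the compelling value-add I'm asking for:

```kotlin
// A Kotlin function exported to JavaScript via the experimental Kotlin/Wasm
// target. The annotations below are the experimental export mechanism as I
// understand it today; treat the details as illustrative, not gospel.
@OptIn(ExperimentalJsExport::class)
@JsExport
fun fib(n: Int): Int =
    if (n < 2) n else fib(n - 1) + fib(n - 2)

// From the JavaScript side, roughly:
//   const { fib } = await import("./app.mjs");
//   console.log(fib(30));
```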
It already has started, in many ways--memes are now common across the Internet showing ChatGPT conversation windows in which it responds to the question, "How do I use GitHub?" with "In order to push code to the remote server, issue the command git branch ..." in a completely erroneous and mistake-riddled answer. The stories of the lawyers who used ChatGPT to write legal briefs that turned out to be filled with inaccuracies and errors--for which they were sanctioned, by the way--have cast some serious doubt on the ability of these generative AIs to "fire all the peons" whose work ChatGPT and its ilk were supposedly going to replace.
It happens with AI every decade or so--something new comes along, it takes our imagination by storm, and we start chanting "AI! AI! AI!" like an amped-up-on-booze-and-caffeine-pills crowd at a wrestling match. Then, as we get disenchanted with the results, the various AI firms--who have yoked their success to the success of their AI work--will start to point out that they were never "AI" companies, they were "large language model" companies and therefore never a part of the hype machine that's now augering in.
Making use of a large company's model for your natural language analysis needs is great, but over time, companies are going to figure out that the cost of that model will probably be smaller if it can be hosted locally, rather than constantly going to the cloud and back again. (See the economies-of-scale discussion earlier, as well as the performance and reliability implications, a la "The Eight Fallacies of Distributed Computing".) Slowly, quietly, companies with some room in their budgets are going to start looking to develop their own models, and in many cases may find that the models they're building don't need to be quite so large (and heavy), that in fact a "medium" language model, or even a "small" language model would work, allowing for a local presence on the same node as the rest of the processing. OpenAI and other firms are going to combat this by constantly releasing new models with new excitement and fanfare, but like the cloud itself, the basic path here will be "start with the hosted LLM, then as your success and budget grows, look to build--and tune--your own". It'll be "Buy vs Build" decision-making, all over again.
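A hedged sketch of that "start hosted, then move local" path: put the model behind an interface, so that swapping the hosted LLM for a self-hosted (possibly smaller) one is a configuration change rather than a rewrite. The endpoints and payloads below are invented--this is not any vendor's actual API:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// The abstraction: callers care about completions, not about who hosts them.
interface Completion {
    fun complete(prompt: String): String
}

// Day 1: rent the big hosted model.
class HostedModel(private val apiKey: String) : Completion {
    override fun complete(prompt: String) =
        post("https://api.hosted-llm.example/v1/complete", prompt, apiKey)
}

// Day N: the tuned, smaller model running on (or near) the same node.
class LocalModel : Completion {
    override fun complete(prompt: String) =
        post("http://localhost:8080/complete", prompt, apiKey = null)
}

private fun post(url: String, prompt: String, apiKey: String?): String {
    val builder = HttpRequest.newBuilder()
        .uri(URI.create(url))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString("""{"prompt":"$prompt"}"""))
    if (apiKey != null) builder.header("Authorization", "Bearer $apiKey")
    return HttpClient.newHttpClient()
        .send(builder.build(), HttpResponse.BodyHandlers.ofString())
        .body()
}
```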
I've been saying this for years, but I keep holding out hope. Frankly, I think some of the development will be to raise the abstraction levels of what we do, because ChatGPT's success at writing code is due to the fact that 95 times out of 100, we're writing the same basic CRUD crap over and over and over again. The Pareto Principle holds here: 80% of all mobile applications (as an example) are doing the same stuff; it's just that it's a different 80% from all the rest. A language/stack that can embrace an onion-style architectural approach (allowing for "trap doors" to drop down a level of abstraction when/if necessary, such as how C/C++ allowed for inline assembly way back in the day) will be the answer here.
The other element I imagine could be interesting to explore is the intersection of natural language models and compiler front-ends--if we can get a language to start looking more like a natural language, it might enable some interesting DSL-type scenarios for end-users, and reduce the demands/pressures on developers to respond to every little change.
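Kotlin's lambdas-with-receivers already gesture in that direction; here's a toy sketch (the invoicing domain is entirely made up) of how close to plain English a statically-typed DSL can get today:

```kotlin
// A tiny "builder" DSL: the lambda-with-receiver makes the call site read
// almost like a sentence an end-user could follow.
class Invoice {
    var customer: String = ""
    private val lines = mutableListOf<Pair<String, Double>>()
    fun line(description: String, amount: Double) {
        lines += description to amount
    }
    fun total(): Double = lines.sumOf { it.second }
}

fun invoice(build: Invoice.() -> Unit): Invoice = Invoice().apply(build)

fun main() {
    val bill = invoice {
        customer = "ACME Corp"
        line("Consulting", 1500.0)
        line("Travel", 320.0)
    }
    println("${bill.customer} owes ${bill.total()}")  // ACME Corp owes 1820.0
}
```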
The demand for the "full-stack developer" is mostly a cop-out on the part of managers, who don't want to artificially restrict their employee search because "hiring is hard". So, rather than carefully considering what skills the current team lacks and needs shored up, they scribble "full-stack developer" on the job description, and proceed to rattle off every single technology that the team has ever used.
If AR/VR is going to achieve anything of any substantial value, it really needs to do so soon, before it is relegated to the "Interesting Things That Were Out Of Their Time" bin, like the Apple Newton and the Xerox Alto. This year might--might--be the year it presents us with something, particularly given the rise in interest in wearables. Certainly, with Meta going all-in on the whole "Metaverse" thing (whoever came up with that should be fired), there's going to be no shortage of money thrown at the problem; the big question is, will there be any actual innovation around UI? Because so long as the AR/VR thing is solely game-centric, it's fairly safe to assume that AR/VR will go right into that bin without too much hesitation. I know, games are a huge industry (measured in the hundreds of billions now), but that's not enough to support hardware; witness the difficulties even conventional gaming hardware (for example, joysticks--today's flight sims are vastly more complicated than the ones twenty years ago, yet nary a joystick to be found) has had over the decades. If AR/VR (which requires hardware specific to it) is going to reach a point that justifies buying specific hardware for it, it has to be for something more than just gaming (and please don't say "education", because if there's one place in the world that has almost no budget, it's education).
More and more vehicles out there are coming with computers in the console, offering easy connection to your mobile device. What's been lacking has been any sort of apps for it, beyond the basics (maps, audio entertainment, etc). Now that there's more of a target market, perhaps we are reaching a critical mass of potential customers to justify the investment in building apps specifically for vehicles.
If this does start to take hold, it will be interesting to see what sorts of apps get built (I could imagine some CRM apps for certain kinds of salespeople, for example) and what form user interaction takes hold (voice control and interaction would be very important for drivers, for example). Frankly, though, the hard part will be the core innovation itself--what sort of apps do we want when we're driving?
This one is a hard one to call, but I'm flipping a coin and landing on the pessimistic side of the "Will Kotlin break out of Android-only?" question. For those who weren't following the Kotlin story at home, Kotlin has recently done a couple of things to make it more than just "that language you use for Android" by developing Kotlin Multiplatform and Kotlin Native. These open up the Kotlin language for use in more situations, but their success will really hinge on how many developers actually want to use Kotlin in more places than just their Android codebase.
Historically, multiplatform languages have not done well, with the sole (arguable) success being that of Java--whose "Write once, run anywhere" campaign didn't really accomplish much. (Remember, most of Java's success was in the server room, not the desktop.) Native might have more possibility for success, but either one gaining any traction would be an interesting development and potentially grow Kotlin beyond just "an Android thing".
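For those who haven't seen it, the core Multiplatform mechanism is the expect/actual split: common code declares what it needs, and each target supplies it. A schematic sketch (in a real project, each piece lives in its own source set, not one file):

```kotlin
// commonMain: shared code declares the seam...
expect fun platformName(): String

fun greeting(): String = "Hello from ${platformName()}"

// jvmMain: ...and each target fills it in.
actual fun platformName(): String = "the JVM"

// androidMain, iosMain, etc. would each supply their own `actual`.
```

Whether enough teams want that seam badly enough to take Kotlin outside their Android codebase is, again, the open question.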
Both languages have reached a point where the weight of the language is really beyond the ability of any one person to keep track of what's there, yet each time a new release comes out, it comes with a whole slew of new proposed features designed to bring new syntax into an already-existing part of the language or an attempt to cram yet-another-interesting-idea into an already "overloaded-collection-of-features"-oriented language. (Can you really look at either C# or Java and tell me that they are "object"-oriented anymore? You can go quite a ways with either and never see or hear an object anywhere along the way by this point.)
Look, maybe this is my "Gitoffamahlawn" moment in these predictions, but both of these languages are old enough to drink, they each spun up powerful and resilient platforms that now host a number of newer, simpler, more-concise languages, and at a certain point in time, it's reasonable to say, "This one is done. There's nothing new we need to add." To keep cramming new stuff in is a terrible addiction--it's a way to show that "We're still hip! We're still doing cool stuff (...and therefore justify why companies should still be paying us license fees!)" We keep telling startups not to do it, but our two major providers of software development tools can't seem to get the lesson themselves.
Spend some time with F#. With Clojure. With Kotlin. Play with some of the new ideas (Mint, Wasp, Io, Alloy). Or even go back and experiment with some of the classics (Smalltalk, Self, LISP, Scheme). But, for your own sake, stop enabling the relentless pounding of the feature surf on the beach of your brain by breathlessly hanging on "What comes next", because in time, you'll be battered into sand.
Look, faith in cryptocurrency is really at an all-time low, except for the people who are caught holding the bottom of the Ponzi pyramid, and they desperately want to hype the hell out of it so they can pass the bag on to the next level down. That won't change. Blockchain, the world's worst-designed distributed database, will continue to wrestle with the goal of finding a reason to exist, and hey, could maybe even find one bullseye, given enough darts. (Many companies that invested in Blockchain have either decided to walk away from the dartboard or else that they're going to load up on a lot of darts--and maybe a shotgun or two--until they hit that bullseye.)
I really would love for these two to just go away entirely, but they won't, because nothing ever dies completely. But this is the last year they're going to get much--if any--serious time in the tech press, despite the fevered efforts of a precious few on LinkedIn. (Yes, Andreessen Horowitz is going big into it; yes, they've been big into it since its start; yes, there will always be people who see what they want to see, instead of following actual evidence; and yes, folks, there's still time to dive in and turn your profit, just don't be the last one holding the pyramid up!)
And, for me, 2024 will be the last year I talk about this in any form.
Thanks, AI! With all these generative natural language tools (like ChatGPT), it will be easier than ever to generate emails intended to trick users into surrendering their credentials to fake sites. That will lead to more breaches, more private info leaks, and more of all the wonderful things that come with those.
What will make it even worse will be all the security companies that roll out "AI-based" security tools, claiming that the AI will somehow be able to better protect against those very same attacks--but while generative AI can create really good-looking human-facsimile output, it's not always great at recognizing artificially-created human-facsimile input. Which means the attackers just got a boost to their toolset, and the defenders will be looking to try and keep up this year.
... as opposed to releasing something drastic. There are some interesting ideas percolating over in the world of DBOS, spearheaded by one of the principals in the Postgres world, but I have a feeling that it'll take a little bit for those ideas to brew before there's really something there to evaluate. (Hope I'm wrong, though.)
Meanwhile, though, players in the RDBMS world will slip a new feature into the mix by polluting the relational model just that wee bit more, the NoSQL players will pick just a tiny bit more from the relational world and add that, and developers will continue to choose which player they want to play with based on criteria that touches maybe 1% of the feature set of the player they choose.
I and a few other folks are working on a book based on the Developer Relations Activity Patterns catalog that I drafted a ways back, expanding on the material there in a big way. That'll come out in 2024, almost guaranteed.
Fifteen predictions for the next twelve months. Let's see how I do this time.
Talk to you all next year... if not sooner.
Last modified 01 January 2024