
The price of convenience: Why the cloud is a trap for you and app developers
Throughout the 2010s, we’ve seen the surge of apps that run in your browser. You simply log into a web page and you can get things done. That sparked the beginning of a dream: what if your app could run literally anywhere? With the web, that dream came true.
However, it came with a bunch of hidden costs that Big Tech is fully aware of, but the general public isn’t. And I don’t think it’d be too farfetched to say that there is a collective of app developers —particularly the most junior— who have yet to realize this trap, one that affects users, maintainers and developers of these apps alike.
This is a cautionary tale for everyone about the upcoming future of apps that you can use on any of your devices.
Cloud computing? Web apps? What are you talking about?
If you already know all about this, feel free to skip this introduction by clicking here. Saves you a bunch of scrolling! You’re welcome.
Otherwise, keep reading along.

In the very beginning (we’re talking around 80 years ago!), computers would run programs and applications entirely inside the computers themselves, and programs (software) would rely on the components inside them (hardware) to do whatever they’d been programmed for.
This can range from calculators and videogames to operating sophisticated machinery, and even programs that run or maintain other programs; not in an Inception kind of way, but as a manager-and-worker model. Although sometimes it is the Inception kind too, through emulation or virtualization.
Because computers could be built for different purposes, software was usually made for a specific device or a very select group of devices. Cross-platform and porting solutions would eventually appear, but until fairly recently in computing history, there was no built-in solution for developing the same application for all kinds of devices, such as your PC, wearables, and tablets.

But even then, what if you had some data that you wanted on all of your devices? What about posting videos publicly?
How the web became the cloud
This is where the Internet comes in: The system that provides scaffolding and protocols for devices to communicate with each other, no matter how far away they are. Under the Internet runs the World Wide Web, or the web for short.
When you make an operation through the web, you connect to another computer. Most often, it will be a server computer that sends some content for your own device (the client) to display, like a blog post or a company’s website. But it can also let you send data in, which is the underlying principle behind social networks.
All of this is handled by both server and client computers, and the software they run is what constitutes a web application. The key insight is that, unlike the regular applications mentioned above, the inner workings of web apps span more than one device. Regular apps can still connect to other places through the web from time to time, such as for sharing files across multiple devices or the online activation of your favorite Adobe product, but most of the heavy lifting is performed exclusively by your computer, whereas with web apps, you ask other computers to do it instead of your own.
Cloud computing, broadly speaking, encapsulates all of the ping-ponging between the servers and the clients of a web app, in a way that can sort of manage itself and give insights to human users and developers. This means that web apps can save on resources when there isn’t much activity, and scale up when traffic spikes. This is wonderful for startup companies, because they have to put very little work into adding resources as more people use the app.
Another benefit of web apps and cloud computing is that this allows an otherwise incapable system to delegate computing power to the server.
As a web app, the Legacy version of this blog uses a modern framework, yet it’s capable of delivering content compatible with computers made all the way back in 1995. As for cloud computing, the former Stadia service and the “cloud” versions of many games on the Nintendo Switch come to mind.
I will come back to this in a bit.

But setting up a cloud data center is very complex and expensive. That’s why the vast majority of web apps that use cloud computing rely on companies that specialize in building and maintaining their own, and rent them out at a price. These are the cloud computing services that I will be referring to as cloud providers from this point onward, and they are the main topic of this post.
After all, hiring the service is way more straightforward than:
- Budgeting how much hardware you need to make the most bang for your buck.
- Having to build a tech stack for a computer that you then need to maintain so it works continuously, without blackouts, overloads, data corruption or hardware failures.
- Coming up with a system to update the app with new features in a way that doesn’t interrupt or break what the users are doing at the time you upload the update — some apps even selectively enable certain features for a group of users.
- Keeping your tech stack up to date to keep it safe from hackers and bots, while at the same time making sure that the updates don’t break your app because of breaking changes.
- Coming up with a system to collect user diagnostics that will help you solve problems with your app.
- Developing a system to restore service with minimal losses in the event of an outage.
And yes, that most definitely means learning a programming language like PHP and a completely different operating system from Windows called ‘Linux’, as well as more technical concepts such as version control, containerization, unit tests, and more.
The traps of cloud computing
I’ve been a web developer for about 8 years now, and I’ve relied on cloud computing to spin up a few personal projects. But it wasn’t until a little before moving my blog off Netlify that I realized the shortcomings of their cloud hosting services for my blog.
My relationship with Netlify was fairly simple and straightforward — I just needed a place to host my personal projects, and I wanted to showcase them for free, so the lower the hosting costs, the better. I was aware of many of the services they offered —image optimization, caching, edge functions…— but I was just making simple static content. I only cared about the hosting.
So if I was able to host my web projects for pretty much free, why did I move my blog out? To put it simply…
The horrors of “lock-in”
This is mostly a concern for app developers. Feel free to skip this section if you’re just a user.
Since relying on cloud computing essentially means enrolling in a service, you must agree to whatever terms they have on their platform. I’m not saying you should be trying to break the rules, but you’re also agreeing to the fact that cloud providers can (and probably will, at some point) change those rules, whether you like it or not.
If a lightbulb just lit up over your head and you thought: “Huh, this sounds a lot like something Adobe would do with their Creative Cloud subscriptions”, that’s a gold star for you! This is pretty much equivalent to the flex pricing models that Autodesk uses for their 3D software: you pay as you use their programs —or in this case, their data centers. Except it can get worse than Adobe.
A common meme in the web app development space is that code can end up in infinite loops that never stop until you force them to. If that happens on one of a cloud provider’s data centers, the costs could climb from hundreds of thousands all the way up to millions of eurodollars of usage, potentially bankrupting your business if you’re not careful. Thankfully, most cloud providers do have countermeasures that you can set in place to mitigate this.
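To make the hazard concrete, here’s a minimal, hypothetical sketch (not tied to any particular provider) of the kind of retry logic that can run away in a metered environment, and the simple cap that keeps it from billing you forever:

```python
# Hypothetical sketch: a retry loop inside a metered cloud function.
# Every iteration bills compute time (and possibly downstream API calls).

def fetch_with_retry(fetch, max_attempts=5):
    """Retry a flaky operation, but with a hard cap so a permanent
    failure can't spin (and bill) forever."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except ConnectionError:
            continue  # transient failure: try again
    raise RuntimeError(f"gave up after {max_attempts} attempts")

# The dangerous version is `while True:` with no cap — if the endpoint
# is down for good, the loop never exits, and the meter keeps running.
```

A hard cap in code plus a billing alert on the provider’s side covers both ends: the bug can’t loop forever, and you hear about it if something else does.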
But here’s the thing: having hundreds of thousands of users in your app looks a lot like an infinite loop of operations within your app and its cloud computing network; it just runs comparatively slower depending on the size of your userbase. So as your app grows, the expenses of the cloud infrastructure go up so much that eventually a cloud provider becomes vastly more expensive than building and running your own infrastructure for the same traffic.
In fact, the lock-in situation is so bad that even cloud providers like Vercel, despite now being perfectly capable of running their own data centers, have in the past relied on other companies like Amazon as the backbone of their service — and those companies could impose fees as high as they want should you wish to leave. It gets even worse for developers using Netlify or Vercel: despite the fact that they’re using the same underlying infrastructure from Amazon, Google or Microsoft, each sets it up in its own way, making it harder to migrate data and configuration from one service to another.
They became popular simply because the underlying services they resell, such as Amazon’s AWS or Google Cloud, are too complex for the end user. If Amazon or Google decide to make their services more accessible, it could spell doom for companies like Vercel and Netlify. In response, these companies charge huge egress fees to ensure you stay with them.

And it’s not just those companies: the Big Tech companies can also supersede or downright shut down their services at will.
Alphabet Inc. has a long history of killing several of their widely used Google services. Microsoft has a similar history.
If your service depends on Big Tech’s, who says they’d keep it around forever?

In my case, I’m one of the many people who used Netlify for the free hosting, so while leaving would force me to host my blog myself, for a simple static blog that’s not that big of a cost if you start small.
And yet, my blog still isn’t fully out of the cloud. Optimized image caches tend to grow pretty big, so I’m using an AWS storage bucket as part of my caching tech stack. I could have gone with an image CDN, but that’s still the cloud, and it would force me to use HSTS (forced HTTPS), which I normally wouldn’t mind — except it would make my blog pointless for older browsers, because legacy HTTPS support is being dropped by many cloud services. Netlify does have an option to turn that off, but the looming threat that they could enforce HSTS on every Netlify site was my dealbreaker for this very specific niche project.
HSTS is still great, though, and you should have it enabled in pretty much all of your projects for safety and privacy reasons.
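For instance, on Netlify you can opt into HSTS yourself by declaring the header in a `_headers` file at the root of your deployed site (a sketch assuming Netlify’s `_headers` format; other hosts have their own equivalents, so check your host’s docs):

```
# _headers — apply HSTS to every path on the site
/*
  Strict-Transport-Security: max-age=31536000; includeSubDomains
```

The `max-age` is in seconds (one year here); `includeSubDomains` extends the policy to subdomains, so leave it out if any subdomain still needs plain HTTP.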
Generative AI is pretty much cloud computing
Generative AIs are bulky, resource-intensive pieces of software that you’ll hardly ever have on your own computer. Just downloading their language models can take hundreds of gigabytes or even terabytes of storage. There’s a reason language models have an extra L in LLMs: they’re Large.
And even if you had that storage, you’d need a ton of power to run the thousands of computing operations they require. Yet another reason thousands of power-hungry GPUs are being used for something other than playing games.
Considering the obnoxiously large costs of running generative AI on your own, as a startup you’ll have no option but to rely on OpenAI, DeepSeek, or whatever may pop up in the future, to offer those kinds of features. And it gets pricey pretty quickly, too.
So if any app you’re using has an AI feature, you can rest assured there’s a data center behind it, simply because setting up an AI is far too complex a task for your everyday developer.
But I’m just a regular person. I don’t make web apps.
This should still concern you. Cloud computing makes developing a web app so easy that it hugely increases the number of people willing to make an app without much knowledge of computer science, leading to lots of apps —several of them basically the same— of questionable quality and security.
One of the main targets of cloud computing is startups backed by venture capital, because venture capitalism is driven by growth; if your project makes it big, scaling up isn’t something you have to worry about.
However, venture capitalism carries a high risk of failure, because startups are not consolidated companies.
And you know what happens when a company fails, right? They have to give up the data centers they use. In other words, the apps they developed — the ones you use — will stop working. And if an app is tied to a specific product, that product essentially becomes waste that won’t do anything.

That’s pretty much one of the main reasons why many apps these days have tiers, one of them being free of charge: to grow and impress investors. But these often come with costs that may be pretty well hidden, or pull shenanigans like making your personal data the product they’re selling. Likewise, once you’ve signed up, it becomes easier for you to get “converted” into a paying user through a subscription — the hottest trend in tech nowadays and a more widespread concern in the last few years. Cloud computing isn’t the source of the surge of subscriptions, but a catalyst.
But computing is not free. It requires power and hardware to run, and that costs money. As the number of free users enrolled in whatever plan they offer grows, so does the company’s credit. Even if they impose harsher limits on free users, you’re still using the service without being charged a single eurodollar; credit is unavoidable.
Cloud computing is just one of the many signature examples of the Uhmerican economy being driven by a growth fetish fueled by credit of any kind.
And then you get kicked where it hurts most: Your debt.

All of these startups and not-so-startup-anymore companies are basically operating at a loss. OpenAI is not expected to turn a profit until 2029, and although Forbes has stated that Notion is “profitable”, it’s fairly small and has only just begun using AI. I wouldn’t be surprised if that situation were to change soon.
It’s hard to find apps that don’t use cloud computing, but there are telltale signs. Check whether these conditions apply:
- It doesn’t ask you to make an account to get started.
- You have to download the app and install it in your system.
- It still fully works if you’re not connected to the Internet.
If these conditions apply, then it’s highly likely that the app you’re using doesn’t use cloud computing.
Conclusions for developers
Simply put, don’t use cloud computing unless necessary. You’re definitely not gonna need it for a to-do list app. And you probably won’t need it for most kinds of productivity apps either. Or cookbooks. Heck, if your e-commerce store only serves a local market, you might be fine with just a regular web app. No need for Shopify or WooCommerce.
Sure, this adds a significant amount of inconvenience, which is something our ever more demanding brains don’t want. But my life philosophy has taught me that being willing to suffer a bit of inconvenience makes you humbler and more grateful for everything, even if it fails. At the end of the day, would you be willing to pay millions just to make something one step more convenient? In the long run, this effort may pay off…
Don’t get me wrong, cloud computing is still pretty good for hosting prototypes or proofs of concept that you might not end up developing into commercial products. But it’s easy to fall for the trap: it’s cheaper in the short term, but not if your app becomes a viral hit.
Cloud computing is also most effective when you need to share things with the general public. For things like YouTube and streaming platforms, it’s basically the only way to share media effectively. But if you’re just sharing data with a select few devices or users, there may be another way.
Peer-to-peer networking was very popular in the 2000s and has pretty much disappeared off the map 20 years later, but the technology is still there. With P2P, your computer becomes a server and a client at the same time. In other words, you use your own computer for computing on the web instead of relying on someone else’s.
People still use P2P networks to torrent movies and games, but one could also use them for file syncing between devices, removing the need for databases. A user would just need a small file with all the relevant data on their devices, with maybe backups in the event of a conflict or corruption. It’s what powers things like games with multiplayer lobbies (Animal Crossing, Minecraft, and even some Call of Duty games used it), and some video calling programs used to use peer-to-peer as well. It may potentially be safer because it’s decentralized: if a computer is compromised, only that user and those connected to them may be at risk, instead of the millions of users affected when hackers breach the databases of large corporations.
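The “server and client at the same time” idea fits in a few lines of code. This is a deliberately minimal sketch over plain sockets on one machine (real P2P adds peer discovery, NAT traversal, and sync logic on top); all the names and ports here are made up for illustration:

```python
import socket
import threading

def run_peer(port, payload):
    """Act as a server: listen on `port` and answer one connection
    with this peer's own data."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen()

    def serve():
        conn, _ = srv.accept()
        conn.sendall(payload)
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()

def ask_peer(port):
    """Act as a client: connect to another peer and read its data."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        return s.recv(1024)

# Two peers on one machine, each serving its own data directly —
# no central server ever holds a copy of anything.
run_peer(9301, b"alice's notes")
run_peer(9302, b"bob's notes")
print(ask_peer(9302).decode())  # this peer reads straight from the other
print(ask_peer(9301).decode())
```

Each process is both ends of the conversation: it listens like a server while dialing out like a client, which is exactly why no datacenter needs to sit in the middle.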
Regardless of what you use, you definitely should be concerned about what would happen should your service close down, and take action to mitigate the damage — something that the tech industry seems to have forgotten about these days.
I think it’s time to bring that consciousness of responsibility back.
