Native Apps

By Mark Nuyens
8 min. read · 📱 Technology
TL;DR

Why haven't progressive web applications (PWAs) taken off the way they should have?

I've followed Apple's practices for many years, and time and again, I reach the same conclusion: web apps have the potential to simplify many aspects of our digital lives. In previous blog posts—Web Apps, Apples & Oranges, Web OS, Regulating Tech, Notification Noise, and most recently The Future—I've shared my vision of a world where progressive web apps (PWAs) play a role equal to that of native apps. For those unfamiliar, a web app is essentially a website that functions like an application but runs within a browser. Native apps, on the other hand, are built with platform-specific languages and frameworks from Apple or Google and installed directly on the device.

Recently, I had to use a mobile app to scan an electric scooter from Bird. Although it's a helpful app, I hadn't used it in a while and had to download it again. Fortunately, I had enough space on my device, but there have been times when my phone's storage was full, preventing me from installing anything new. This made me wonder: why should we need to install anything at all when the same interface could be accessed through a browser? It's just a matter of developer support, but it would greatly enhance accessibility.

While the difference between native and web apps can often be so subtle that users hardly notice, some argue that native apps have superior capabilities, particularly when it comes to sensor support and speed. However, I believe that native apps often fail to fully utilize these sensors, resulting in a user experience that is nearly identical to that of web apps. The debate over speed and performance is also becoming less relevant, as the difference is now so minimal that it's almost negligible. Looking ahead, the rollout of 5G networks is likely to further enhance the performance of web apps and websites, promising near-instant load times. This brings us to the question: why haven't web apps become more popular and mainstream?
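To put the sensor point in perspective: much of the hardware access people associate with native apps is already exposed to web apps through standard browser APIs. A minimal sketch, assuming a browser that supports the Geolocation API and a user who grants the permission prompt:

```typescript
// Minimal sketch: a web app reading the device's location through the standard
// Geolocation API (requires a secure context and the user's explicit permission).
navigator.geolocation.getCurrentPosition(
  (position) => {
    console.log(`lat ${position.coords.latitude}, lon ${position.coords.longitude}`);
  },
  (error) => {
    console.error("Location unavailable:", error.message);
  }
);
```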

Some argue that native apps have the advantage of offline functionality, but I would disagree. Even Apple's own Weather app lacks offline support, a feature that would be highly convenient. Imagine if the app displayed the last available forecast data when offline, while informing me that the estimates aren't real-time. Instead, it shows nothing but blurred blocks or placeholders waiting for data. There are many examples of apps where offline support would be beneficial, yet they often refuse to load without an internet connection. I mean, then why offer a native app in the first place? In contrast, progressive web apps often handle these scenarios better: they rely on established standards that allow browsers to manage offline situations effectively, provided the developers have implemented support. It's remarkable that native apps can be the less user-friendly option here, yet Apple still doesn't seem to actively promote web apps.
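The "established standards" here are mainly service workers and the Cache API. As a minimal sketch of the kind of offline fallback a weather web app could ship (the file name, cache name, and fallback behaviour are my own illustrative choices, not any particular app's code):

```typescript
// sw.ts — a minimal service worker (compiled against the "webworker" lib):
// serve fresh data when online, fall back to the last cached response when offline.
declare const self: ServiceWorkerGlobalScope;

const CACHE = "offline-fallback-v1";

self.addEventListener("fetch", (event: FetchEvent) => {
  // Only GET requests can be cached; let everything else pass through untouched.
  if (event.request.method !== "GET") return;

  event.respondWith(
    fetch(event.request)
      .then(async (response) => {
        // Online: keep a copy of the latest response for later offline use.
        const cache = await caches.open(CACHE);
        await cache.put(event.request, response.clone());
        return response;
      })
      .catch(async () => {
        // Offline: show whatever was cached last instead of an empty screen.
        const cached = await caches.match(event.request);
        return cached ?? new Response("Offline", { status: 503 });
      })
  );
});
```

With something like this in place, the browser itself hands the app its last known data, which is exactly the behaviour I wish the Weather app had.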

While Apple does offer the ability to add websites as web apps on the home screen, this feature is not prominently displayed. One would need to know where to look inside the browser, if the option is even there at all. More importantly, Apple doesn't seem to consider web apps to be applications. Most users look for apps in the App Store, so imagine if Apple added a “Download Web App” button alongside the native versions for iOS, iPadOS, and watchOS. The download would be instant, requiring no data transfer or even an internet connection, as it's simply adding a shortcut to a web address. Over time, I strongly believe users would prefer the web version over the native one. Web apps are not only lighter in storage but also more up-to-date than their native counterparts. Leveraging browser technology could unlock exciting new features or layouts that might be harder to achieve using Apple's own programming language, Swift. This would ensure a consistent user experience across all devices—whether a laptop, tablet, TV, or phone—with the same features available everywhere.
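For reference, what makes a site installable as a web app today is a small manifest file. Here is a sketch of the sort of metadata it carries, written as a typed object purely for illustration (real projects ship it as a static manifest.json, and every value below is a placeholder):

```typescript
// Illustrative shape of a web app manifest; browsers read this to offer
// "add to home screen" and to launch the app without browser chrome.
interface WebAppManifest {
  name: string;
  short_name?: string;
  start_url: string;
  display: "standalone" | "minimal-ui" | "fullscreen" | "browser";
  icons: { src: string; sizes: string; type: string }[];
}

const manifest: WebAppManifest = {
  name: "Example Web App",        // placeholder values throughout
  short_name: "Example",
  start_url: "/",
  display: "standalone",          // behaves like a native app once installed
  icons: [{ src: "/icon-192.png", sizes: "192x192", type: "image/png" }],
};
```

A hypothetical “Download Web App” button in the App Store would need little more than this file plus the site's URL.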

So, why doesn't Apple fully support web apps? Well, in some ways they do, but they wouldn't be very pleased if developers suddenly offered only web apps and users stopped downloading native apps. The reasons are mainly commercial: Apple profits from native apps through sales and subscriptions. Moreover, native apps are more tightly integrated with Apple's ecosystem, enabling features like sharing via social apps and moving files into apps like Dropbox. While these limitations could be overcome through standards, it would require active cooperation from Apple—something they're unlikely to offer. After all, they have no incentive to do so, other than making it easier for web developers like myself to integrate web applications more seamlessly. Given how web standards have evolved over the years, I'm confident we could make these compatible with native features like photo or file sharing. It just requires the existence of these standards and support from vendors like Apple. Currently, there's no regulation or incentive forcing Apple to take this step.
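Some of these standards already exist. The Web Share API, for example, lets a web app hand content to the system share sheet wherever the browser and OS choose to support it; a minimal sketch with placeholder values:

```typescript
// Minimal sketch of the standard Web Share API: pass content to the native
// share sheet if the browser exposes it (all values here are placeholders).
async function shareArticle(): Promise<void> {
  if (!navigator.share) {
    console.log("Web Share API not available in this browser");
    return;
  }
  await navigator.share({
    title: "Native Apps",
    text: "Why haven't web apps taken off?",
    url: "https://example.com/native-apps",
  });
}
```

How well this integrates with native apps on the receiving end is precisely the part that still depends on vendors like Apple.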

Apple appears to prioritize native apps over websites and web apps, subtly guiding users toward their preferred ecosystem. A good example is that Apple allows websites to include a piece of code in their markup that automatically redirects users from the web to the corresponding native app. While this may seem like a choice made by app developers, it's evident that Apple has facilitated this behavior, steering users away from the open web and toward native applications. What's more, users have no straightforward way to limit or disable this automatic redirection. Once redirected, any purchases made within the app contribute to Apple's revenue, as they take a cut of transactions within native apps. This suggests a clear preference from Apple for interactions through native apps, restricting user choice and potentially disregarding user preferences.
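The exact mechanism isn't named above, but one common pattern is a small script that points the page at the app's custom URL scheme, which the OS then hands to the native app; Apple's Smart App Banner meta tag is a related, more visible variant. A rough sketch of the redirect pattern, with a made-up scheme and URLs:

```typescript
// Illustrative redirect pattern: try the native app's custom URL scheme first,
// fall back to the web page if nothing handles it. Scheme, URLs, and timing are made up.
function redirectToNativeApp(): void {
  const appUrl = "examplescooter://ride/12345";         // hypothetical custom scheme
  const fallbackUrl = "https://example.com/ride/12345"; // hypothetical web fallback

  window.location.href = appUrl; // the OS may open the corresponding native app

  // If the page is still visible after a moment, assume the app didn't open.
  setTimeout(() => {
    if (!document.hidden) {
      window.location.href = fallbackUrl;
    }
  }, 1500);
}
```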

Moreover, Apple further entrenches native apps by preloading a range of default apps onto its devices, apps that users never explicitly asked for, such as iTunes or Siri. These apps are presented as essential parts of the iOS experience, yet users are rarely informed that they are optional and can be removed. While Apple might argue that these pre-installed apps offer convenience, the reality is that most users, trusting the authority of Apple or fearing they might disrupt their device's functionality, are unlikely to delete them. This approach ensures that Apple's apps remain dominant in a crowded marketplace of competing applications. By not making it clear that these default apps are neither necessary nor impossible to replace, Apple essentially lays the groundwork for a walled garden. This walled-garden strategy works by creating and sustaining an environment where users are subtly encouraged to remain within the company's ecosystem, often without fully realizing they have other options. The preloading of default apps is just one method Apple employs to reinforce this ecosystem, subtly ensuring continuous user engagement with its products and services.

Another important question worth considering is why developers are so eager to support a platform that essentially doubles their workload. They must create apps for iOS, Android, the web, and possibly even desktop platforms. This significant development burden has helped iOS and Android maintain their dominance—not because they are the best, but because developers lack the time or resources to develop for any other platform. Imagine having to rewrite a single feature for more than five different operating systems; it's simply impractical. However, app developers and companies continue to support native apps for at least one compelling reason: the ability to send notifications directly to users. While web notifications are possible, Apple has made this process particularly challenging by obfuscating its documentation. Another reason for supporting all of these platforms is simply availability: not being listed in the App Store almost signals that you don't exist or don't take your business seriously (which, of course, is often not the case).
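For comparison, the standard web push flow from the page's side is short. The sketch below assumes a service worker is already registered; the VAPID public key and server handling are placeholders, and on iOS, Safari additionally limits web push to web apps that have been added to the home screen:

```typescript
// Minimal sketch of standard web push: ask permission, then subscribe through
// the service worker. The subscription object would be sent to your own server,
// which is what actually delivers the push messages. The key below is a placeholder.
async function enableWebPush(): Promise<void> {
  const permission = await Notification.requestPermission();
  if (permission !== "granted") return;

  const registration = await navigator.serviceWorker.ready;
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: "<public-vapid-key-placeholder>",
  });

  console.log("Push subscription:", JSON.stringify(subscription));
}
```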

Finally, it's remarkable that regulators haven't addressed the broader disparity between native and web apps, as they should. Hopefully, they are actually aware of this and, through supporting web standards and organizations like the W3C, are working on a solution behind the scenes. As I suggested in my article Regulating Tech, this issue should be addressed from the ground up, not from the top down. Instead of altering existing solutions from tech giants like Apple, regulators should incentivize users and companies to adhere to global web standards that enhance consumer applications, especially on mobile. If there's one thing the web has going for it over Apple and its trillion-dollar empire, it's that the web is larger and involves more people in making it a democratized space for exchanging ideas, technology, and solutions. By staying true to the mission of making the web a better place, we can ensure it delivers the most value to consumers and prevents companies like Apple from dictating what is and isn't allowed on personal devices.