We've written here before about how native applications are king because they provide the best user experience. A recent report focused on answering the question "Is the web dead?" seems to back up a shift towards native apps over a more traditional web experience, but it also seems to indicate that there is an accepted difference between the two. The results of the survey, which was run by the Imagining the Internet Center at Elon University, are summed up in a report from Pew Research Center's Internet & American Life Project, and they are somewhat interesting even though the sample set was relatively small (around 1,000 respondents).
The quick summary of the study is that 59% of the survey respondents feel that the World Wide Web (meaning websites, not internet-enabled services or apps) will be a bigger part of our lives in 2020, and that the apps that are created will be for specialized use cases on specific devices, much as they are today. Additionally, the same people felt that the web is and will remain the dominant platform in people's lives.
What's interesting is that these numbers represent what the people hoped would happen, not necessarily what they thought would happen. At that point, it makes sense that 59% of people hoped things would remain the same. We resist change, so it stands to reason that six out of ten people would hope to maintain the status quo.
On the flip side, 35% felt that native applications would take over (using the internet, but not the World Wide Web) and be used for work, play, communication, and content creation. They also agreed that these applications will be perceived as higher quality and more secure, and that the web will become less important and less useful as more and more people use apps.
My problem with apps (from a consumer standpoint) for singular tasks is that you only have access to a specific set of data. The web is wide open, and if I want to search for restaurant reviews, I'm not limited to whatever Yelp or Urban Spoon has for me. So, based on that, I'm siding with the 59% group for the status quo.
On the other hand, if you look at how I currently use my desktop, it's almost 100% native applications (even if they depend on cloud or web services). They all use the internet to work in some way or another, but they are installed and executed locally because I like the experience better than their web counterparts (if they have any). TweetDeck and Evernote are good examples: they work just fine in their respective cloud versions, but I prefer the experience of the native apps. Still, for reading the news or shopping or whatever, I'm using the web.
On my phone and tablet, I'm just about the same. I don't use either for social networking that much, but when I do, I'm inclined to use native apps (GPS functionality, camera integration, etc…). The same goes for note taking with Evernote, or accessing remote desktops (because HTML5 solutions are good, not great). There are apps for everything, and that would mean I side with the 35% (and leaves me wondering about the other 6%).
So what gives? How can I be so firmly in both camps? The answer is easy: the user will go to the best experience. If someone makes an HTML5 client that beats out Wyse PocketCloud and lets me connect to any environment, not just ones with a gateway in them, I'll use it. Until then, I'll stick with the natively installed app. If Amazon made an app for browsing Amazon.com that's as good as the web interface, I'd use that (the one they have is close, but no cigar).
Maybe that's just me, though. A Flurry Analytics study shines light on a trend that shows that people are now spending more time using native applications than they are using the web:
That shift is almost certainly driven by user experience, and it could be alarming to companies that are trying to decide on a mobile application solution for their employees. If your users expect native apps and native performance, you could be stuck writing, maintaining, and supporting several versions of applications for different platforms, not to mention expending lots of resources coding for all the different devices that are running those platforms.
That's why I think, at least as far as enterprises go, there isn't going to be any growing shift towards native applications. For evidence of this, I looked further along in the Flurry Analytics report to a chart that shows what people use their mobile devices for. The results aren't all that surprising:
Users are playing games on these things. The chart doesn't distinguish between native and web-based games, but it does show the primary uses for consumers: the bulk of time spent in mobile apps goes to gaming. "Work" is not on that list, and presumably takes up a fraction of the "Other" category.
So, if you're an enterprise considering your options for deploying applications to any mobile device, web-based is probably the most enticing option, depending on the app requirements. Obviously, if you need specific access to hardware you might be stuck with native apps, but even some hardware is being exposed via the browser (location awareness, for instance).
Back to the original question that the survey attempted to answer, "Is the web dead?," we can answer "absolutely not." There will be a balance of web vs. native apps into the next decade, much like there is today. It may give a little to one side or the other, but I don't believe that any one approach will take over. Not while there are so many differences between them, and not while there are so many use cases out there to satisfy. I believe that native apps will have their place, and the web will have its own, which I guess puts me somewhere in that other 6%.