Sorry if this is the wrong place or something, but I saw a question in here on PHP vs ASP.NET, which made me think it might be the right place. On the other hand, I also saw posts asking about people's breakfast, so I don't really know...

Users are already starting to embrace web applications such as Google Docs over their desktop counterparts. From a coding perspective, it seems like many languages are shifting gears to be more web-oriented. For instance, Java has embraced JavaFX, which seems more of a web-app library than Swing was/is. Even more, it uses CSS for styling. There are tools for developing cross-platform apps for iOS and Android in HTML5 (yes, I know that they're slower, not as popular, not completely platform-independent, etc., but still). I've also heard that Windows 8 development involves HTML and CSS (though I haven't looked into it that much).

I'm asking this from a skill-development point of view. I've been reading Objective-C and Java books, in which interface design is a nightmare compared to HTML/JavaScript/CSS. As HTML/JavaScript/CSS starts to invade desktop interface development and more web applications are built with HTML5, will knowledge of Java/Objective-C/C++ become secondary to knowledge of PHP/ASP.NET/HTML/CSS/JavaScript, etc.? Will web skills be more important than desktop skills?

NOTE: By desktop skills, I mean skills that are used on the desktop today (or three years ago). By web skills, I mean skills that are used on the web today (or three years ago). I know it's a pretty vague distinction, but the skills you would develop today to program an app in Java are completely different from those you would learn to build the same app for the web (at least until recently).


Until the web becomes 100% secure, I don't see many people and companies relying entirely on it. It's difficult to believe companies will give up their company secrets to use web-based programs. Yes, there is room for web-based programs, but there are also going to be a lot of desktop programs in the foreseeable future. So if you are starting an education program, pick the type of programming/scripting you like best, because you can't go wrong.

I think they serve very different purposes.

I think that it is true that a lot of the wide-distribution applications for "light" work or for play are shifting towards web applications more and more. But that's because there are more and more people using the internet (duh!) and more companies and organizations rolling out custom applications for some special purpose, and it is cheaper, faster and more convenient to release those on web platforms or as iPhone / Android apps. And as a user, these are the kinds of applications you use every day and all the time, i.e., they're in your face. There is no doubt that this market is big and growing.

But then again, most of these kinds of applications are pretty trivial. It is true that HTML / CSS / JavaScript is really easy for making GUIs, easier to write and distribute than doing the equivalent in Java / C# / C++, but these languages are also limited to doing just that: setting up a simple GUI with some trivial code (if any) running in the background. In other words, they don't do any heavy lifting. So saying that "desktop languages" will disappear is like saying that cargo ships should have disappeared after we invented airplanes; the truth is, cargo ships are still the most effective means of shipping large quantities of goods, regardless of how ubiquitous airplanes are today.
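
To give a feel for just how little it takes (a minimal sketch of my own, not anything from this thread): a working browser "GUI" is a few lines of JavaScript, with no toolkit, no layout manager, and no build step.

    // A complete browser "GUI": one button and a click handler.
    // Paste into a <script> tag at the end of any HTML page.
    var button = document.createElement("button");
    button.textContent = "Click me";
    button.addEventListener("click", function () {
        // The trivial "background logic" mentioned above.
        alert("Hello from the web!");
    });
    document.body.appendChild(button);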

For instance, I work in the engineering and robotics field. Most of the work we do on computers involves either computer-aided design (CAD), various kinds of advanced simulations (i.e., "number crunching"), or programming control systems on small embedded devices. I don't see any of that being able to shift towards any kind of web development, ever, because it simply doesn't make any sense. These applications are not serving some John Doe user who is out there on the internet, so it makes no sense for them to be facing the internet, especially not with all the issues and inefficiencies associated with that.

And then, after all this talk about these great web languages, and also counting the server-side languages (PHP, ASP.NET, Python, etc.) and other high-level languages, you must not forget that these languages still rely on good old C / C++ code to do pretty much everything. Most of these languages are just nice easy-to-use wrappers around extensive C/C++ libraries. Even high-level languages like Java are being phased out at many big server-side companies because they need something faster to be the workhorse of their massive servers, and that language is, of course, C++.

So, you have to be careful about the natural bias you have towards things that are in your face every day but aren't necessarily representative of the trends and realities that go on "under the hood". Just because modern cars have more and more gadgets on the dash doesn't mean you no longer need a team of qualified engineers to design the engine and other mechanical parts under the hood.

And you also have to factor in the level of skill. Writing an application for a local public transportation authority that lets users look up timetables and "optimize" their travels has its challenges, but overall, it's pretty easy. By contrast, writing the back-end of a distributed database system is super hard. My guess is that the programmer doing the former and the one doing the latter don't earn the same salary.

And finally, I also share Ancient Dragon's concerns about security. I cannot possibly see any company putting any significant amount of company work/data on these web-based (or cloud) applications any time soon. I don't think you realize how vulnerable these systems are. You can walk into a company server like you can walk into Grand Central Station. And while we're at it, the Windows 8 style of HTML / CSS apps is also known to be a massive open door for cyberattacks. The web development field has a lot of homework to do before any company in its right mind would entrust any sensitive data to these platforms (meaning, they can't get their workforce to adopt them as their main tools for work). Not to mention that I don't think many companies appreciated finding out that all their private company secrets have been backed up by the NSA (mostly through private contractors, who are also in the business of selling information).

And finally, I also share Ancient Dragon's concerns about security.

Likewise. If you look at history, you'll see a distinct pattern of alternating between centralizing and decentralizing when it comes to data storage and retrieval. Right now, with the whole Cloud nonsense, we're in a decentralization period, but I'd wager that one good security breach will find everyone scrambling to centralize again, and secure local software will keep on keeping on. ;)

It is a mixed bag in my mind, and it all depends on who is doing what and where.

There is no point in a small flower shop sitting on a street corner investing in its own server racks when an online storage solution would be cheaper and more appropriate.

On the other hand, it would make sense for a large company that has a couple of hundred employees, all using various things such as intranets, extranets, file servers, etc., to purchase their own servers and run them themselves.

When it comes to development for these systems, you shall always have desktop applications to do one thing or another. The majority of these online, "cloud"-based systems still rely upon a user to create the work offline, whether it be through the use of a specialist application or a basic word processor.
They all complement each other; however, to conclude my views:

  • Web applications shall continue to be developed, and shall make things considerably easier for home users
  • Desktop applications shall take centre stage in work and business environments, and for anything that requires more than typing text or moving a few images around, simply for their efficiency, security and, for the end user... familiarity

With perhaps the exception of low-level hardware-control stuff and intensive stuff like 3D modelling/gaming, where C/C++ has its place, web-based apps will eventually become the norm; I think they will eclipse the need for desktop apps.

The cross-platform guise of Java and .NET isn't really cross-platform IMO (maybe it was Apple that killed it off; well, anything involving some kind of framework), but I honestly believe HTML + jQuery + server-side scripting is truly the future. It is scalable and works well for most tasks. Controlling apps from your desktop will become a thing of the past. Cloud-based solutions are where it is heading.
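
To make that stack concrete, here is a minimal sketch of the browser half of such an app; the /timetable.php endpoint and its JSON shape are invented for illustration, not taken from any real service.

    // Hypothetical example: ask a server-side script (PHP here, but it
    // could be anything) for JSON data and render it with jQuery.
    $.getJSON("/timetable.php", { route: 42 }, function (data) {
        var items = data.stops.map(function (stop) {
            return "<li>" + stop.name + " - " + stop.time + "</li>";
        });
        $("#timetable").html("<ul>" + items.join("") + "</ul>");
    });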

3D gaming in the future may be web-based, with all of this 'WebGL' stuff coming out (basically a JavaScript API for OpenGL). It's not very practical at the moment, but eventually I think it could be a thing.


3D gaming in the future may be web-based, with all of this 'WebGL' stuff coming out (basically a JavaScript API for OpenGL). It's not very practical at the moment, but eventually I think it could be a thing.

It's funny you should say that, because I'd like to revise my original statement: even computationally intensive stuff can be cloud-based. I don't think WebGL will take off because it will always be limited by the user's hardware, but if the hardware is cloud-based, I can see people using their home computers as dumb terminals/portals to play games.

The only issue, it seems, is the bandwidth bottleneck.

https://www.youtube.com/watch?v=hvZKmSvOxMY

The video above shows how web-based render farms (OTOY) were used to play GTA over the internet. Also, OTOY specialise in rendering on the GPU, and they are convinced server farms of the future will leverage the massive parallel speed increases of GPU computation.

Watch this space. I don't think you'll see CPU server farms, but GPU ones instead.

http://www.youtube.com/watch?v=uRdSxZtUpFk

Very good point. Microsoft is starting to do this with their new console; I'm interested to see how it goes.

The WebGL stuff makes sense for enabling some rather simple browser-based 3D graphics, something that has been sorely missing for some time now, especially with the collapse of all the efforts for a 3D web standard of some kind (OpenInventor -> VRML -> X3D, and everything in between and around that). Of course, it will probably remain fairly limited because it is (1) client-side and (2) hosted by a browser and a script interpreter. This severely limits the extent of the graphics that can be handled. But WebGL is going to be super useful for everything from browser-based Google Earth kinds of things to better in-browser demos that use 3D graphics.
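
For a sense of what "simple" means here, this is roughly the minimum WebGL code just to clear a canvas to a single colour (a sketch assuming a <canvas id="view"> element on the page); shaders, buffers and draw calls all come on top of this, which is where the client-side limits start to bite.

    // Minimal WebGL setup: grab a context and clear the canvas to a colour.
    // Assumes the page contains <canvas id="view"></canvas>.
    var canvas = document.getElementById("view");
    var gl = canvas.getContext("webgl");
    if (!gl) {
        throw new Error("WebGL not supported by this browser");
    }
    gl.clearColor(0.1, 0.2, 0.3, 1.0); // RGBA components in [0, 1]
    gl.clear(gl.COLOR_BUFFER_BIT);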

As for cloud-based 3D graphics, I don't think it makes any sense at all from an architectural point of view. It's all pain, no gain. The buses involved in a computer architecture on the path between CPU -> GPU -> screen are some of the fastest buses on a computer, typically on the order of 30-300 GB/s. And you're going to replace that with... the internet, just so you can avoid the need for a piece of electronics the size of a pocket calculator on the client side? I think not. This thing is just a trendy sound bite; it makes no sense in reality.
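
A back-of-the-envelope calculation (my own illustrative numbers, not measurements) shows the gap: a single uncompressed 1080p stream at 60 frames per second already needs roughly 3 Gb/s, far beyond a typical home connection, which is why cloud rendering lives or dies by aggressive compression.

    // Rough bandwidth estimate for streaming rendered frames to a thin client.
    // The numbers are assumptions for illustration.
    var width = 1920, height = 1080; // 1080p
    var bytesPerPixel = 3;           // 24-bit RGB, no alpha
    var fps = 60;

    var bytesPerSecond = width * height * bytesPerPixel * fps;
    console.log((bytesPerSecond / 1e6).toFixed(0) + " MB/s");     // ~373 MB/s
    console.log((bytesPerSecond * 8 / 1e6).toFixed(0) + " Mb/s"); // ~2986 Mb/s
    // Compare: the CPU -> GPU -> screen path above runs at 30-300 GB/s,
    // roughly 100-1000x more than even this uncompressed stream.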

The only issue, it seems, is the bandwidth bottleneck.

Yeah, of course, bandwidth is always the issue, everywhere, in everything related to computers. If there were no bandwidth limitations on networks, on system buses, on memory read/write speeds, etc., there would be no constraints whatsoever. The art of computing architecture design is figuring out the best way to distribute the bandwidth requirements. And computing the 3D graphics on the cloud and then streaming it down to a thin client is in the realm of extreme stupidity when it comes to efficient load distribution. And I can guarantee that "thin" client won't be that thin at all, given the bandwidth it would have to accommodate (even if it is just carrying it, not generating it).

PRAISE THE LORD FOR PEOPLE WITH COMMON SENSE! I've tried explaining the security risks of web storage to people who use things like SkyDrive or iCloud, or who have spoken to a person with a fruit on their shirt and suddenly think they're tech gods.

Explaining why it's not the best place to store naked pictures of yourself (if that’s what you like to do in your spare time), or sensitive info like bank details, is like trying to fit a very large, square peg in a very small, round hole!

As you have all mentioned, the cloud has its obvious advantages and disadvantages. However, although it is not as vulnerable as cloud storage, centralised storage isn't 100% safe either; after all, if someone wants something, they will find a way to take it.

Obviously there will be more and more web apps appearing because more and more people are using the internet, but as mike_2000 said, 3D gaming simply needs too much power and bandwidth to be worth it, and the tech needs to be developed a lot more before it's even worth considering. Even then, it will still never be as quick or powerful as good old desktop rigs!

I agree. Cloud storage is not the best idea in my opinion, especially in the United States at the moment. Privacy in the United States is long gone, especially given the bad things I and many others foresee.


Will web become more important than desktop programming?

What's your definition of 'important'? As in the criticality of the software, or the popularity of the medium? I think you'll always need both; I can't see one being more important than the other. OSes of some flavour will always be required AFAIK, and you can't browse the internet without some sort of browser. As Mike eloquently states, some things just don't lend themselves to web processing / packaging, but having access to online services via a web interface is usually a hell of a lot easier than the alternative.

Pretty much anybody can learn to 'programme' for the web, but desktop software programming requires something a bit more substantial, as IME it's far less forgiving. The term 'programming' seems to have evolved slightly too. Now, it seems, anybody who can stitch together a WordPress or Joomla site with a couple of choice lines of PHP and some third-party plug-ins and add-ons can call themselves a 'programmer' (or at least a 'developer').

Web-based apps are usually developed on the assumption that the user is always connected to the internet. But that is not the case everywhere, is it? Also, I personally don't trust the cloud to store data, and with the recent revelations about PRISM and all, intellectual property isn't safe on web servers.
