Guys. Don't listen to Smash. He doesn't know what he's talking about.
Cables? Do they drive their cable-cart horses hard with their buggy whips? IT infrastructure physical build-out is largely unskilled labor. Crimping Cat 6 cable doesn't require a four-year degree.
Yes and no. The cables kinda have to connect two devices. And those have to be configured. And the arrangement of switches/MDF/IDF/whatever and the cables connecting them has to be set up properly, according to some kind of plan. Your statement is like saying that tightening a nut is largely unskilled labor, so there's no value in anyone getting a job in the car manufacturing business. That's a gross simplification of all the work that's required and the different levels of skill needed.
What you hear at high-level IS conferences these days is primarily talk of transferring the client-side workload to devices owned by employees.
Anyone who's been in the industry for any length of time knows that about 80% of what gets projected at that level is pure BS. That's an idea that some people who don't understand the technology *want* to happen. Not because it's useful, or increases productivity, or even reduces cost, but because it seems like a smart thing to business people who use the technology but don't know how it actually works.
In the real world, businesses are backing away from that model (if they ever even entertained it in the first place). Want to know why? Because it's 100 times harder (and more expensive) to manage a network full of random devices any person might walk through the door with. And that's before even contemplating the massive incubation of cross-network viruses, hacks, etc. that would inevitably occur. There's a reason companies maintain standard lists of devices allowed more than basic access to their networks, and require the installation of security kits on them.
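The allow-list idea above can be sketched in a few lines. This is a toy illustration, not any real NAC product's API; the model names, agent names, and tiers are all made up for the example. The point is the gating logic: unknown hardware gets guest access, approved hardware without the required security kit gets quarantined, and only fully compliant standard devices get real access.

```python
# Hypothetical sketch of a device allow-list check. All names here
# (models, agents, tiers) are illustrative, not from any real product.

APPROVED_MODELS = {"ThinkPad X1", "Latitude 7440", "MacBook Pro 14"}
REQUIRED_AGENTS = {"disk-encryption", "endpoint-av", "mdm-agent"}  # the "security kit"

def network_tier(device_model: str, installed_agents: set) -> str:
    """Return the access tier a device should get."""
    if device_model not in APPROVED_MODELS:
        return "guest"       # unknown hardware: browse-only access
    if not REQUIRED_AGENTS <= installed_agents:
        return "quarantine"  # approved model, but missing part of the security kit
    return "full"            # standard, fully managed device

print(network_tier("ThinkPad X1", {"disk-encryption", "endpoint-av", "mdm-agent"}))  # full
print(network_tier("RandomPhone", set()))  # guest
```

Note the asymmetry: every random device someone walks in with lands in the first two branches, which is exactly the management burden (and cost) the post is describing.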
If you want to do more than browse a network and run very light client portal apps, the BYOD model doesn't work. It's more expensive, more complex, and reduces productivity. Server-side management is still (and likely will be for the foreseeable future) the model used, with clients consisting increasingly of very lightweight apps to access said content. There's no shifting of that load to client devices, let alone to personal devices. I know it may look that way from the outside, but that's really not what's happening. Making your server-side applications accessible to a wider range of thin-client software doesn't shift the load at all. And if anything, it increases the demand for higher-tier IT. Fewer guys lugging computers around, and more guys setting up wireless infrastructure, maintaining servers with applications, and architecting portal/device application/abstraction layers.
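The "no load shifts to the client" point can be made concrete with a toy sketch, assuming a made-up dataset and function names. The server owns the data and does all the computation; the client only formats a request and renders the reply. Adding more client types (phone, tablet, browser) just adds more thin renderers; none of the storage or compute moves.

```python
# Toy sketch of the thin-client split. Dataset and function names are
# invented for illustration; in practice the boundary would be an HTTP API.

SERVER_DATA = {"q3_sales": [120, 340, 560, 210]}  # lives server-side only

def server_report(dataset: str) -> dict:
    """All storage and computation happen here, whatever the client is."""
    values = SERVER_DATA[dataset]
    return {"total": sum(values), "average": sum(values) / len(values)}

def thin_client_view(dataset: str) -> str:
    """The client just asks and renders; it holds no data and does no math."""
    report = server_report(dataset)
    return f"{dataset}: total={report['total']}, avg={report['average']:.1f}"

print(thin_client_view("q3_sales"))  # q3_sales: total=1230, avg=307.5
```

Swap the client for an iOS app or a web page and `server_report` doesn't change at all, which is why widening thin-client access increases server-side work rather than offloading it.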
If I log into my work network from the phone I show up with, and the data is held on a server, it costs the company a lot less in maintenance, primarily because they can hire far fewer (ideally no) local employees to maintain networks. The end game is a massive reduction in tech spending, offloaded to employees. Having a device that works will be like having a pencil.
Wrong. Because the company has to maintain a wireless network supporting every protocol you might walk through the door with, and an array of server-side apps to let you use the server's processing power and data from any such random device. And I'll point out again that for any "real" applications, specially configured devices are often still required. Not a lot of EDA tools running on the iPad right now, and not likely any time soon.
If the biggest app your company runs is email, that may go more BYOD. But then, that's not a tech-focused company, and those aren't where the best jobs are anyway.
But you know, flying cars, sh*t happens, who knows how it'll really be in 10 years.
Yeah. Stop listening to the folks in the trade mags. They get it wrong 99% of the time.