11
Aug 16

Baseline – Keeping Up With Digital Disruption

Key takeaway – The reality is that cloud and IoT technologies are now synonymous with digital innovation and change. Without them, it’s impossible to take processes and workflows to a higher level and achieve the performance and cost gains that are now critical for success.

Only a quarter of companies surveyed are investing in the cloud, and a fifth are focusing on the IoT. As a result, many of them are at risk of being disrupted.

The pace of digital change is clearly accelerating. For business and IT executives, all of this translates into huge challenges—but also enormous opportunities. Individuals who can innovate, disrupt and reinvent businesses and industries will emerge as the leaders in the new economy.

A recently released technology adoption report from TD Bank highlights a number of technology trends. The bank surveyed CEOs, CFOs and company founders at the Bloomberg Breakaway Summit in New York City and found that:

More of the Baseline article from Samuel Greengard


10
Aug 16

ZDNet – Half of all cloud services outside of IT departments, but IT is getting wiser

A new study from the esteemed Ponemon Institute says we still aren’t doing nearly enough to protect enterprises in the cloud.

For starters, the survey of 3,476 IT and IT security practitioners, commissioned by Gemalto, a digital security vendor, finds that half of all cloud services and corporate data stored in the cloud are not controlled by IT departments. So there’s a lot of cloud activity among business units that’s potentially not vetted or governed.

However, IT departments are getting a better handle on things, the survey also shows. Fifty-four percent of respondents are “confident” that the IT organization knows all cloud computing applications, platforms or infrastructure services in use – a nine-percentage-point increase over a similar survey from 2014.

The survey doesn’t spell out how and why IT is getting a better grip on shadow cloud adoption. It may be assumed that there are more policies in place and greater communication and collaboration on best practices. IT may be getting more active in its evolving role as cloud broker or service provider to the enterprise, providing catalogs or directories of vetted services available to business users.
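
To make the cloud-broker idea concrete, here is a minimal Python sketch, purely illustrative, of what an IT-curated catalog of vetted services might look like; the service names, providers and data classifications below are hypothetical and not drawn from the survey.

```python
from dataclasses import dataclass, field

@dataclass
class VettedService:
    """One entry in a hypothetical IT-curated cloud service catalog."""
    name: str
    category: str            # e.g. "storage", "CRM", "analytics"
    provider: str
    approved_for: list = field(default_factory=list)  # data classifications allowed
    owner: str = "IT Cloud Broker Team"

# A tiny illustrative catalog that business users could browse
CATALOG = [
    VettedService("Object Storage", "storage", "ExampleCloud",
                  approved_for=["public", "internal"]),
    VettedService("Team CRM", "CRM", "ExampleSaaS",
                  approved_for=["internal", "confidential"]),
]

def find_approved(category: str, classification: str):
    """Return vetted services in a category approved for a data classification."""
    return [s for s in CATALOG
            if s.category == category and classification in s.approved_for]

if __name__ == "__main__":
    for svc in find_approved("storage", "internal"):
        print(f"{svc.name} ({svc.provider}) is approved for internal data")
```

In practice such a catalog would live in a service portal or CMDB rather than in code, but the core idea is the same: business users query for approved options instead of procuring cloud services on their own.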

More of the ZDNet post from Joe McKendrick


08
Aug 16

IT Business Edge – Tread Carefully into the Mission-Critical Cloud

The initial phase of the cloud transition is nearly done, with more than three-quarters of enterprises pushing at least a portion of their workload to public infrastructure.

As expected, however, most of this is non-critical data and applications and is largely limited to storage and backup services rather than production workloads. So it stands to reason that the next leg of the cloud journey will involve mission-critical workloads – the stuff that sets the corporate suite’s hair on fire if it should cease to function for any reason.

This is why the growth of cloud computing is likely to slow down some as we approach the next decade. It’s not that the enterprise is growing tired of the cloud or is starting to see more of its flaws (yes, the cloud does have flaws), but that future deployments will have to be handled with more care as the stakes get higher. Not only will cloud services have to be more resilient going forward, but they will be increasingly optimized from the ground up to suit highly targeted processes, which takes time and coordination between users and providers.

More of the IT Business Edge post from Arthur Cole


03
Aug 16

ZDNet – Today’s cloud computing projects are missing something – a strategy

Everyone at some level is exploring or considering public cloud options for a range of functions — from automating IT functions to enhancing business processes.

If anything, transitioning to public cloud is a slow-moving process for most businesses: a new survey of 500 IT and business executives, published by Softchoice, finds 61 percent are “still experimenting with or making limited use of public cloud.”

The survey also finds a lack of strategic thinking when it comes to cloud implementations. A majority, 54 percent, report their teams struggle to form an effective cloud strategy, and 52 percent lack a formalized cloud strategy altogether.

Having a cloud strategy makes a big difference, the survey suggests. Compared with IT leaders who have no public cloud strategy in place, those with a formal strategy are less likely to grapple with cloud skills gaps, the cloud procurement model, and cloud budgeting. Fifty-eight percent of companies without strategies have experienced cloud failures, compared with only 22 percent of strategy-minded organizations. Seventy-five percent of those without strategies say they are struggling to find the right skills, compared with 41 percent of those with strategies. And while 70 percent of companies without strategies ran over budget, only 52 percent of those with strategies had such issues.

More of the ZDNet article from Joe McKendrick


02
Aug 16

The Server Side – Managed services model addresses cloud-based analysis paralysis

It can be a tad disconcerting when a popular trend pushes its way through the industry and you and your organization have yet to jump on the bandwagon. For enterprises that haven’t yet moved their applications onto the Azure, Google or Amazon cloud, it would be understandable for managers and C-level executives to question both why it hasn’t happened yet and when it actually will. But according to Jordan Jacobs, vice president of products at SingleHop, the Azure, Amazon and Google cloud models are being oversold, and for many core business functions, a managed services approach to application hosting is often a better model.

Public clouds vs. managed services model

“The thing that wows a lot of people is the market share discrepancy between public clouds and managed services, especially when compared to the press each one gets,” said Jacobs. “Amazon, Azure and Google get all of the press, but they’re actually only about a third of the managed services and managed hosting market.”

Unfortunately, the love affair the press is having with the dominant cloud providers is causing a great deal of consternation among decision makers. On the one hand, decision makers feel they need to catch up with the latest trend; on the other hand, they have a hard time rationalizing, in terms of cost efficiencies, security and business benefits, the porting of their core business applications into the public cloud. The result is a sort of analysis paralysis, in which decision makers are unsure whether using the public cloud is the right move, whether the managed services model makes more sense, or whether they should just keep everything on premises.

More of The Server Side post from Cameron McKenzie


01
Aug 16

TheWHIR – Nearly Half of All Corporate Data is Out of IT Department’s Control

Many organizations are not responding to the continuing spread of “Shadow IT” and cloud use with appropriate governance and security measures, and more than half do not have a proactive approach, according to research released Tuesday. The 2016 Global Cloud Data Security Study, compiled by the Ponemon Institute on behalf of Gemalto, shows that nearly half of all cloud services (49 percent) and nearly half of all corporate data stored in the cloud (47 percent) are beyond the reach of IT departments.

The report is drawn from a survey of more than 3,400 IT and IT security practitioners from around the world. It shows that only 34 percent of confidential data stored in SaaS applications is encrypted, and that members of the security team are involved in only one-fifth of decisions about which cloud applications and platforms to use.

IT departments are making gains in visibility, with 54 percent saying the department is aware of all cloud applications, platforms and infrastructure services in use, up from 45 percent two years ago. The number of respondents saying it is more difficult to protect data using cloud services also fell, from 60 percent to 54 percent; however, those gains were offset by more broadly reported challenges in controlling end-user access.

More of the WHIR post from Chris Burt


27
Jul 16

ITWorld – Disaster recovery in a DevOps world

Organizations that are adopting DevOps methodologies are realizing actual benefits from taking that approach.

According to a 2015 survey by IT Revolution Press in conjunction with Puppet Labs, organizations using DevOps deploy code 30 times faster than others, doing deployments multiple times per day. Moreover, the change failure rate is cut in half with DevOps, and services are restored up to 168 times faster than they are at non-DevOps organizations.

Let’s focus on those last two points for a moment. One thing is for certain: Embracing DevOps also pays off from a disaster recovery standpoint, because the tools and procedures that you use to move applications from development to testing to production and back to development again can also be applied to failing over and recovering from disasters and service interruptions. The same tools that automate the entire DevOps life cycle can also help you make the most of the resources you already have for recovery purposes.

There are indeed plenty of open-source tools to help with this automation, like Chef and Puppet, which create, launch and deploy new virtual machine instances in an automated way and configure them appropriately. They even work across security boundaries, deploying on your laptop, in your own data center, or in the public cloud. Amazon Web Services and Microsoft Azure are two major public cloud providers that support Chef and Puppet.
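
As a rough illustration of that point, here is a minimal Python sketch using boto3 (the AWS SDK for Python) to relaunch an application server from a pre-baked image in a standby region; the AMI ID, region, instance type and tags are placeholders, and in a real DevOps pipeline a Chef or Puppet run would typically configure the instance once it is up.

```python
# A minimal sketch (not the article's code) of automating recovery with boto3:
# relaunch an application server from a known image in a standby region.
import boto3

RECOVERY_REGION = "us-west-2"               # standby region, assumption
APP_SERVER_AMI = "ami-0123456789abcdef0"    # pre-baked image, placeholder

def fail_over_app_server():
    ec2 = boto3.client("ec2", region_name=RECOVERY_REGION)
    response = ec2.run_instances(
        ImageId=APP_SERVER_AMI,
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "disaster-recovery"}],
        }],
    )
    instance_id = response["Instances"][0]["InstanceId"]
    # Wait until the instance is running before repointing traffic to it.
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
    return instance_id

if __name__ == "__main__":
    print("Recovered instance:", fail_over_app_server())
```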

More of the ITWorld article from Jonathan Hassell


26
Jul 16

Arthur Cole – Navigating the Challenges in IoT Infrastructure

The speed at which the enterprise has embraced IoT infrastructure has been impressive. But with most deployments still in a nascent stage, many organizations are only just now starting to encounter some of the challenges associated with scaling up to production levels.

According to Strategy Analytics, nearly 70 percent of businesses have deployed IoT solutions, and that is expected to increase to 80 percent within the year. But as Datamation’s Pedro Hernandez points out, many organizations are struggling with the analytics side of the equation. While gleaning insight into complex environments is the main driver of IoT, it isn’t always easy to determine exactly how the analytics should be done. As the data coming into the enterprise mounts, so too will the complexity of the analytics process, which can deliver vastly different results based not only on what data is collected and how it is conditioned, but also on what questions are asked and even how they are phrased.
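
A toy Python example (my illustration, not from the article) shows how the same raw sensor readings can support very different conclusions depending on how the data is conditioned and which question is asked:

```python
# A toy illustration of how conditioning and question framing change
# IoT analytics results computed from the same raw sensor readings.
readings = [21.0, 21.3, 20.9, 21.1, 85.0, 21.2]  # one spurious spike

def mean(xs):
    return sum(xs) / len(xs)

raw_average = mean(readings)               # "What is the average temperature?"
cleaned = [r for r in readings if r < 50]  # conditioning: drop implausible spikes
cleaned_average = mean(cleaned)
peak = max(readings)                       # "What was the worst case?"

print(f"raw average:     {raw_average:.1f}")      # ~31.8, skewed by the spike
print(f"cleaned average: {cleaned_average:.1f}")  # ~21.1
print(f"peak reading:    {peak:.1f}")             # 85.0
```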

Perhaps not altogether surprisingly, the most effective use of IoT is happening not in the enterprise or in commercial operations but on the manufacturing floor, says tech journalist Chris Neiger. Recent research from BI Intelligence shows that industrial manufacturers are well ahead of verticals like banking, telecom and energy in their deployment of IoT solutions. The field is being led by General Electric, which is leveraging IoT for everything from industrial assembly lines to navigation and fuel management systems. Company executives say an IoT-supported Industrial Internet could contribute $10 trillion to $15 trillion to global GDP over the next two decades.

More of the IT Business Edge article from Arthur Cole


22
Jul 16

ManageEngine – Bimodal IT- Double the action, twice as fun

Christopher Reeve, Brandon Routh, and Henry Cavill are all big names and share one thing in common. What connects them is the fictional superhuman bimodal character they have all embodied. And who doesn’t love that character? He’s Superman. He can do it all.

In one mode, he falls well within most conventional norms and fits perfectly into a world of indifference and acceptance. In his other mode, though, he’s a symbol of change. He’s something the world has never seen before, and something the world agrees with. His kind of change is good. His kind of change brings hope.

Now let’s bring IT into this picture. What can IT folks learn from him? And how can they harness that hope? It’s simple—go bimodal. Stability is a must and change is unavoidable. But that doesn’t mean that both can’t coexist. In fact, Gartner predicts that by 2017, 75 percent of IT organizations will have a bimodal approach. In this approach, mode one is about legacy and predictability, leading to stability and accuracy. Mode two is about innovation and exploration, which lead to agility and speed.

More of the ManageEngine article from Ravi Prakash


19
Jul 16

Baseline – Cloud-First—Except When Performance Matters

Many companies have a cloud-first policy under which as many applications as possible are hosted in the cloud, but apps that are latency-sensitive are staying on premises.

In the name of achieving increased IT agility, many organizations have implemented a cloud-first policy that requires as many application workloads as possible to be hosted in the cloud. The thinking is that it will be faster to deploy and provision IT resources in the cloud.

While that’s true for most classes of workloads, those applications that are latency-sensitive are staying home to run on premises.

Speaking at a recent Hybrid Cloud Summit in New York, Tom Koukourdelis, senior director for cloud architecture at Nasdaq, said there are still whole classes of high-performance applications that need to run in real time. Trying to access those applications across a wide area network (WAN) simply isn’t feasible. The fact of the matter, he added, is that there is no such thing as a one-size-fits-all cloud computing environment.
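
For teams weighing that trade-off, a quick latency check is often the first data point. Below is a small Python sketch, with placeholder hostnames rather than anything from the article, that compares TCP connect times to an on-premises service and a cloud-hosted endpoint:

```python
# A small sketch (hostnames are placeholders) that measures TCP connect
# latency, the kind of check used when deciding whether a latency-sensitive
# workload can tolerate a WAN hop to the cloud.
import socket
import time

def connect_latency_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Average time to establish a TCP connection, in milliseconds."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass
        total += (time.perf_counter() - start) * 1000
    return total / samples

if __name__ == "__main__":
    for label, host in [("on-premises service", "app.internal.example"),
                        ("cloud-hosted service", "app.cloud.example.com")]:
        try:
            print(f"{label}: {connect_latency_ms(host):.1f} ms")
        except OSError as err:
            print(f"{label}: unreachable ({err})")
```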

More of the Baseline article from Mike Vizard