21
Jul 16

CIO Insight – Why IT Departments Lack Diversity Programs

The majority of IT departments and their organizations are doing relatively little to increase workforce diversity, according to a recent survey from TEKsystems. Very few tech pros and leaders, for example, said their company has a formal diversity program in place. They admit that they struggle to find quality talent to fill open IT positions, yet they rarely consider diversity in recruitment efforts, ignoring the value of existing diversity programs that could help close those gaps.

“While IT departments struggle to find qualified IT workers for their teams, our data indicates that most have yet to leverage diversity programs to help solve that challenge,” said Michelle Webb, director of diversity and inclusion for TEKsystems. “In our conversations with clients regarding diversity initiatives, we’ve found that IT departments are less aware of the value that diversity programs can play in their skills-sourcing efforts when compared to human resources or business leadership.”

With the shortage of qualified IT workers likely to increase, organizations need to add diversity programs to their arsenal to address their hiring needs.

More of the CIO Insight slideshow from Dennis McCafferty


20
Jul 16

Continuity Central – Majority of organizations experience downtime and service degradation due to IT capacity issues

Super interesting research on the hidden troubles associated with IT capacity.

Sumerian has published the results of its latest research, conducted in conjunction with analyst house Freeform Dynamics. The research revealed a genuine mismatch between the IT infrastructure that businesses have in place and what they actually need, supporting the widely held view that there is significant overspend on server capacity across industries. Worryingly, it also revealed a total mismatch between the capacity management tools and processes currently in place and those needed to deal with this issue.

Key highlights of the research include:

76 percent of IT professionals resort to overprovisioning IT infrastructure in order to avoid capacity-related issues
‘Overprovision and forget’ remains the most common approach amongst IT professionals, with the vast majority relying heavily or partially on instinct and vigilance (90 percent), system alerts and alarms (86 percent), and a range of ad hoc tools and practices (73 percent) to manage capacity in a very reactive way. As a result, less than one in five (18 percent) rated their capacity planning practices for their overall IT system resources as ‘very effective’, with others admitting they were less than ideal (54 percent) or wholly inadequate (21 percent).
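
To make the reactive-versus-planned contrast concrete, here is a minimal sketch in Python (mine, not from the Sumerian research): the reactive check simply waits for utilization to cross an alert threshold, while a simple linear trend estimates how many weeks of headroom remain. The utilization series and the 80 percent threshold are hypothetical.

```python
# Minimal sketch: reactive alerting vs. a simple capacity forecast.
# The weekly utilization numbers and the 80% threshold are hypothetical.

ALERT_THRESHOLD = 80.0  # percent utilization that trips a reactive alarm

# Hypothetical weekly peak CPU utilization for one server, in percent.
weekly_peak_util = [52.0, 54.5, 57.0, 59.5, 62.0, 65.0, 67.5, 70.0]


def reactive_check(latest_util: float) -> bool:
    """'Overprovision and forget': act only once an alarm fires."""
    return latest_util >= ALERT_THRESHOLD


def weeks_until_threshold(history: list[float]) -> float | None:
    """Proactive view: fit a straight line to the trend and estimate
    how many weeks remain before the alert threshold is crossed."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    if slope <= 0:
        return None  # flat or falling trend: no crossing expected
    return (ALERT_THRESHOLD - history[-1]) / slope


if __name__ == "__main__":
    print("Alert firing now?", reactive_check(weekly_peak_util[-1]))
    eta = weeks_until_threshold(weekly_peak_util)
    print("Weeks of headroom left:",
          "no upward trend" if eta is None else round(eta, 1))
```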

More of the Continuity Central article


19
Jul 16

Baseline – Cloud-First—Except When Performance Matters

Many companies have a cloud-first policy that hosts as many applications as possible in the cloud, but latency-sensitive apps are staying on premises.

In the name of achieving increased IT agility, many organizations have implemented a cloud-first policy that requires as many application workloads as possible to be hosted in the cloud. The thinking is that it will be faster to deploy and provision IT resources in the cloud.

While that’s true for most classes of workloads, latency-sensitive applications are staying home to run on premises.

Speaking at a recent Hybrid Cloud Summit in New York, Tom Koukourdelis, senior director for cloud architecture at Nasdaq, said there are still whole classes of high-performance applications that need to run in real time. Trying to access those applications across a wide area network (WAN) simply isn’t feasible. The fact of the matter, he added, is that there is no such thing as a one-size-fits-all cloud computing environment.
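
As a rough illustration of the WAN latency point (my sketch, not anything Nasdaq described), the snippet below times a TCP handshake to a candidate endpoint and compares it with a per-request latency budget. The host names and the 2 ms budget are assumptions for illustration only.

```python
# Rough sketch: measure TCP connect round-trip time to a candidate host and
# compare it against a per-request latency budget. The hosts and the budget
# are hypothetical; real placement decisions weigh far more than raw RTT.

import socket
import time

LATENCY_BUDGET_MS = 2.0  # hypothetical budget for a latency-sensitive app


def connect_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Return the time, in milliseconds, to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0


def placement(host: str) -> str:
    rtt = connect_rtt_ms(host)
    verdict = "cloud candidate" if rtt <= LATENCY_BUDGET_MS else "keep on premises"
    return f"{host}: {rtt:.1f} ms round trip -> {verdict}"


if __name__ == "__main__":
    # 'cloud-endpoint.example.com' is a placeholder, not a real service.
    for candidate in ("localhost", "cloud-endpoint.example.com"):
        try:
            print(placement(candidate))
        except OSError as err:
            print(f"{candidate}: unreachable ({err})")
```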

More of the Baseline article from Mike Vizard


18
Jul 16

IT Business Edge – Confused by Digital Transformation? Welcome to the Club

If you feel that you’re falling behind in the race to digital transformation, take heart – you’re not alone. It turns out that a good chunk of enterprise leaders believe they are either coming up short in building the next-generation data environment or are unsure where they stand because the definition of success is too vague.

This should not come as a huge surprise, of course, since digital transformation is unlike technology developments of the past, primarily because it involves much more than technology. This time, the change reaches way beyond the data center and into the very heart of the business model and the business culture itself.

More of the IT Business Edge article from Arthur Cole


12
Jul 16

ZDNet – Cloud computing pushes enterprise vendors closer to their customers

Cloud computing may help make running enterprises a little bit easier (allegedly), but it has not made running an enterprise software business any easier. If anything, things have gotten more difficult for vendors lately.

The most challenging piece of the rapidly accelerating migration to cloud for enterprise software providers is delivering a superior customer experience.

That’s the gist of a recent analysis produced by Bain and Company, which points out that in the era of cloud connectivity, the era of shoddy releases and so-so customer service is coming to an end. “For many years, enterprise technology companies got along fine with pretty low customer experience ratings – just about the lowest, in fact, of the industries we measured,” the report’s authors, Chris Brahm, James Dixon and Rob Markey, state. But it never seemed to matter, they continue: “Once software or hardware was installed and running, companies were reluctant to go through the expense and hassle of changing vendors, even if the technology wasn’t delivering a superior experience.”

More of the ZDNet article from Joe McKendrick


11
Jul 16

CloudExpo Journal – The End Goal of Digital Transformation

Although we often write about and discuss digital transformation, we frequently fail to identify the end goal we are really trying to achieve. We talk at great length about data, analytics, speed, information logistics systems and personalized user experiences, but none of these are the end goal. Ultimately, we must digitally transform so we can remove the “fog of war” and gain clear visibility and insight into our businesses and the needs of our customers. The end goal of digital transformation, however, is the ability to act and react to changing data, competitive conditions and strategies fast enough to succeed.

Knowledge is worth nothing unless it is tied to action. In a recent survey, 500 managers reported that the number one mistake companies are making in digital transformation is moving too slowly. A company may have all the necessary information and strategies, but if it is incapable of acting or reacting fast enough to matter, that knowledge is wasted. True digital transformation includes the information logistics systems capable of collecting, analyzing and reporting data fast enough to be useful, plus the ability to act and react in response.

More of the CloudExpo Journal from Kevin Benedict


08
Jul 16

CIO Dashboard – A CIO’s Guide for Engaging the Board

Guest post by Paula Loop, Leader of PwC’s Governance Insights Center

New technologies, from artificial intelligence and drones to 3-D printing, predictive analytics, and driverless cars, are disrupting how companies compete and create value. US CEOs believe investing in technology is the most direct path to meaningful innovation and operational efficiency, but these new technologies are generating risks that boards are scrambling to contain.

The average age of a director at a public company is 63. The majority of public company directors aren’t sitting executives who work through technological advancements in their day jobs. Given the pace of technological change, how can boards really be on top of their game?

Directors recognize their dilemma. Nearly one-third of the directors polled in our 2015 Annual Corporate Directors Survey say their board isn’t sufficiently engaged, or isn’t engaged at all, in overseeing and understanding the company’s annual IT budget. Similarly, 33% say the company’s approach to IT strategy and risk mitigation doesn’t anticipate potential advantages from emerging technologies.

What’s the solution? Should we swap out all sitting directors for millennials and the technologically savvy? Or should we push companies to prioritize IT awareness and devote elements of board meetings to IT education? The answer lies somewhere in between.

More of the CIO Dashboard article from Chris Curran


06
Jul 16

CIOInsight – The Heavy Cost of System Downtime

IT system outages have emerged as fairly routine issues for companies today, and the resulting downtime amounts to a five-figure financial hit every day, according to recent research from CloudEndure. The resulting “2016 Disaster Recovery Survey” report reveals that while the majority of IT professionals say they’ve set service availability goals of 99.9% (the industry-standard “three nines” mark), far fewer say they’re capable of achieving this “most of the time.” As for the culprits? Human error or network failures are usually to blame, not to mention app bugs, storage failures and (of course) the ever-troublesome hacker. Disaster recovery solutions would help. However, only a minority of businesses use disaster recovery for the majority of their servers.
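
For a sense of what a 99.9% target actually allows, the short sketch below converts an availability percentage into permitted downtime per year; this is standard arithmetic, not a figure from the CloudEndure survey. Three nines works out to roughly 8.8 hours of downtime a year.

```python
# Convert an availability target into allowed downtime per year.
# Standard arithmetic, not figures from the CloudEndure survey.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours, ignoring leap years


def downtime_hours_per_year(availability_pct: float) -> float:
    """Hours of downtime per year permitted by an availability target."""
    return HOURS_PER_YEAR * (1.0 - availability_pct / 100.0)


if __name__ == "__main__":
    for target in (99.0, 99.9, 99.99, 99.999):
        hours = downtime_hours_per_year(target)
        print(f"{target}% availability -> {hours:.2f} hours "
              f"({hours * 60:.0f} minutes) of downtime per year")
```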

More of the CIO Insight slideshow from Dennis McCafferty


05
Jul 16

SearchDataCenter – IT lifecycle management drives smarter refresh decisions

Here’s a nice look at the PC refresh model from a business perspective.

IT teams need to strike a balance between keeping up with the latest technology and being cost-effective. Proper IT lifecycle management techniques can help.

IT is a highly dynamic environment, with new products constantly coming to market. Organizations that want to have the best of everything must chase the market and accept the high cost of continuously replacing or updating their IT systems. Having the most up-to-date platform all the time doesn’t always work from a cost-benefit standpoint.

The majority of organizations follow one of two IT lifecycle management models for equipment refreshes. In the first model, organizations view equipment as having a nominal lifespan and replace it when that lifespan ends. If a piece of equipment fails during its lifespan, they replace it either with a similar specification, to avoid retro-testing existing workloads against new infrastructure, or with a newer, more powerful and energy-efficient system.
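
As a toy illustration of that first model (my sketch, not the article's), the function below flags a scheduled refresh at the end of a nominal lifespan and, for an in-life failure, chooses between a like-for-like swap and a newer system. The five-year lifespan and the decision rule are hypothetical.

```python
# Toy sketch of the first refresh model: a nominal lifespan drives scheduled
# replacement, and an in-life failure is handled either like-for-like (to
# avoid retesting workloads) or with a newer system. Parameters are hypothetical.

from datetime import date

NOMINAL_LIFESPAN_YEARS = 5  # hypothetical refresh cycle


def refresh_action(purchased: date, failed: bool, today: date,
                   avoid_retesting: bool = True) -> str:
    age_years = (today - purchased).days / 365.25
    if age_years >= NOMINAL_LIFESPAN_YEARS:
        return "scheduled refresh: replace with current-generation hardware"
    if failed:
        return ("replace with a similar specification" if avoid_retesting
                else "replace with a newer, more efficient system")
    return "keep in service"


if __name__ == "__main__":
    print(refresh_action(date(2011, 3, 1), failed=False, today=date(2016, 7, 5)))
    print(refresh_action(date(2014, 6, 1), failed=True, today=date(2016, 7, 5)))
```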

More of the SearchDataCenter article from Clive Longbottom


30
Jun 16

Baseline – IT Struggles to Meet Network Capacity Demands

An insatiable need for access to data and digital technologies is causing organizations to expand their network capacity to staggeringly high levels, according to a recent survey from Viavi Solutions. The resulting “Ninth Annual State of the Network Global Study” indicates that most companies will soon be running the majority of their apps in the cloud, seeking to lower expenses while provisioning network resources more effectively. In addition, most enterprises are deploying some form of software-defined networking (SDN). At the same time, they’re investing in state-of-the-art unified communications (UC) tools, including VoIP and Web collaboration apps, all of which contribute to the need for more bandwidth. “Data networks of all types around the globe are being strained by an explosion of traffic, from bandwidth-hungry video today to the Internet of things tomorrow,” said Oleg Khaykin, president and CEO at Viavi Solutions.

More of the Baseline slideshow from Dennis McCafferty