05
Feb 16

The Register – After safe harbour: Navigating data sovereignty

Max Schrems has a lot to answer for. The Austrian is single-handedly responsible for bringing down a key transnational data agreement, leaving cloud service providers scrambling for legal counsel. This is either a good thing, if you’re a privacy activist concerned about intrusive US surveillance policies, or a confusing and worrying one, if you’re a provider or customer of cloud services.

Worried by the Edward Snowden revelations, Schrems complained to the Irish Data Protection Commissioner that Facebook was collecting his data in Ireland and then moving it to the US for processing. The Irish DPC simply pointed to the Safe Harbour agreement and said that its hands were tied.

The case was bumped up to the Court of Justice of the European Union (CJEU), which on October 6 ruled that Safe Harbour was invalid. Its rationale was that the agreement allowed data to be accessed for US national security purposes without establishing whether the protections on that data were adequate.

More of The Register article from Danny Bradbury


03
Feb 16

Baseline – Why IT Pros Give Tech Transformation a Weak Grade

Few front-line technology workers give their companies high marks for adapting to new, transformative tech, according to a recent survey from Business Performance Innovation (BPI) and Dimension Data. The resulting report, “Bringing Dexterity to IT Complexity: What’s Helping or Hindering IT Tech Professionals,” indicates that most organizations haven’t even begun to transform IT, or are just getting started. A major sore spot: a lack of collaboration and alignment with the business side, as most tech staffers said business teams wait too long to bring IT into critical planning processes. This, combined with a lack of funding and other resources, results in tech departments spending too much time on legacy maintenance and far too little on essential advances that bring value to the business. “Instead of ushering their companies into a new age of highly agile innovation, IT workers are hindered by a growing list of maintenance tasks, staff cutbacks and aging infrastructure,” according to the report.

More of the Baseline Magazine article from Dennis McCafferty


02
Feb 16

IT Business Edge – Weighing the Pros and Cons of Commodity Infrastructure

Data infrastructure built on commodity hardware has a lot going for it: lower costs, higher flexibility, and the ability to rapidly scale to meet fluctuating workloads. But simply swapping out proprietary platforms for customizable software architectures is not the end of the story. A lot more must be considered before we even get close to the open, dynamic data environments that most organizations are striving for.

The leading example of commodity infrastructure is Facebook, which recently unveiled plans for yet another massive data center in Europe – this time in Clonee, Ireland. The facility will utilize the company’s Open Compute Project framework, which layers advanced software architectures atop low-cost commodity hardware. That framework is now available to the enterprise community at large as a series of reference architectures that are free for the asking. The idea is that garden-variety enterprises and cloud providers will build their own modular infrastructure to support the kinds of abstract, software-defined environments needed for Big Data, the Internet of Things and other emerging initiatives.

More of the IT Business Edge post from Arthur Cole


27
Jan 16

Continuity Central – Six tips for successful IT continuity

Andrew Stuart offers some experience-based, IT-focused business continuity tips:

1. Understand the threat landscape

Storms, ransomware and fires are only a few of the many real threats for which all businesses should proactively prepare. Your IT department needs a full understanding of all the threats likely to hit your building, communications room or servers in order to prepare for the worst. This can be done by assessing risks based on the location and accessibility of your data centres, as well as any malicious attacks that could occur. When planning to mitigate a disaster, treat every incident as unique: a local fire may affect one machine, whereas human error may lead to the deletion of entire servers.

2. Set goals for recovery

While some companies assume that duplicating their data protects them in the wake of a disaster, many learn the hard way that their backups stopped functioning long before the disaster or that their data is inaccessible afterwards. The IT team needs to define criteria for recovery time objectives (RTO), the maximum time the business can continue to run without access to its data, and recovery point objectives (RPO), the maximum acceptable age of the data restored from backup, in other words how much recent work the business can afford to lose. The IT team will also need to identify critical systems and prioritise recovery tasks.
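To make the two objectives concrete, here is a minimal sketch in Python (the names and targets are hypothetical, not from Stuart's article) of how a team might check the freshness of its newest backup against an RPO and the duration of a recovery drill against an RTO:

    from datetime import datetime, timedelta, timezone

    # Hypothetical targets, illustration only: the business tolerates at
    # most 4 hours of downtime (RTO) and at most 1 hour of lost data (RPO).
    RTO = timedelta(hours=4)
    RPO = timedelta(hours=1)

    def meets_rpo(newest_backup_at: datetime) -> bool:
        """True if the newest backup is fresh enough that restoring it
        would lose no more recent data than the RPO allows."""
        return datetime.now(timezone.utc) - newest_backup_at <= RPO

    def meets_rto(drill_started: datetime, drill_finished: datetime) -> bool:
        """True if a recovery drill restored service within the RTO."""
        return drill_finished - drill_started <= RTO

The distinction the sketch draws is the one that matters: RPO is measured against the age of the data you can restore, RTO against the time the restore itself takes.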

More of the Continuity Central article from Andrew Stuart


26
Jan 16

CustomerThink – We’ve always done it this way…

Does inertia matter more to you than delivering better services to your clients?

Something happened the other day that reminded me of a time when I was still new in this business. Yes, I didn’t have the 25+ years of experience I do now, but I still had enough under my belt to know what was going on and how to evaluate a department and its staff.

I had just started working as a banquet manager at another hotel and found that most of the waiters had been working there for around 7 years, some for up to 15. After a few days of observation, I made a list of the things I knew we could do better and planned the steps needed to make them happen. No big deal; I’d done this many times before.

On the following week’s schedule I listed a date for a mandatory meeting/training class and prepared the topics I would discuss. The meeting day arrived and we all sat around a series of round tables and enjoyed the coffee, soda and bottled water I had prepared for them. Hey, if I force you to come in for a meeting, the least I can do is have some beverages prepared for you…right?

More of the CustomerThink post from Steve DiGioia


25
Jan 16

CloudExpo blog – Cloud and Shadow IT – An Inevitable Pairing?

You can’t seem to have a conversation about cloud technology and its impact on the business without the topic of Shadow IT coming up. The two concepts at times seem so tightly intertwined that one would think there is a certain inevitability, almost a causal linkage, between them. Shadow IT tends to be an emotional topic for many, dividing people into one of two camps. One camp sees Shadow IT as a great evil, putting companies, their data and their systems at risk by implementing solutions without oversight or governance. The other camp sees Shadow IT as a source of innovation, helping the company succeed by allowing the business to bypass a slow and stagnant IT organization. Does going to the cloud inherently mean there will be Shadow IT? And if it does, is that necessarily a good or bad thing?

More of the CloudExpo blog post by Ed Featherston


22
Jan 16

About Virtualization – The Network is Agile, but Slower with SDN and Microservices

Have you ever moved something in your kitchen because it fits better, only to find that you spend more time fetching it from where you used to keep it close at hand? It’s a simple analogy, but it does relate to some of the confusion around SDN and microservices implementations.

As new methodologies and technologies come into an organization, we assess what they are meant to achieve. You work out a list of requirements you want to see and, from that wish list, check off which are met by the product of choice. As we look towards microservices architectures, which I fully agree we should, we have one checklist for the applications. As we look at the challenges that SDN solves, which I also agree we should, we have another checklist.

Let’s first approach this by dealing with a couple of myths about SDN and microservices architectures:

More of the About Virtualization post


21
Jan 16

Formtek – Cloud Computing: While Security Remains Biggest Concern, Security Tops List as Reason to Implement Cloud

A new study on cloud computing use by small and medium-sized companies, from Exact and Pb7 Research, finds two surprising results.

The first has to do with security. Typically, security is cited as the number one reason businesses avoid the cloud; but surprisingly, the Exact report found that security is also the number one reason cloud adopters give for choosing it. It seems like some sort of love/hate relationship. The second interesting result is that, in general, businesses using the cloud are better off financially than peer businesses that aren’t.

More of the Formtek post from Dick Weisinger


20
Jan 16

Data Center Knowledge – Data Centers as a Competitive Tool in Today’s Business Landscape

Twenty years ago, data centers were viewed through a Wizard of Oz-tinted lens. They were a big, powerful and expensive means of data storage, but few business stakeholders outside the IT department really understood their impact, or knew what was going on behind the curtain. The digital revolution flipped this reality on its head. Today, data centers are no longer bulky cost centers but drivers of business, enabling the data processing and availability that modern enterprises need to maintain continuity and gain competitive advantage.

The Importance of Data

Data is everywhere: it is created by nearly everything – tollbooths, online transactions, instant messaging, telephone calls – and it has become Earth’s most abundant digital resource. In fact, every day we create 2.5 quintillion bytes of data. As a result, data has become businesses’ greatest asset and competitive differentiator.

More of the Data Center Knowledge post from Russell Senesac


15
Jan 16

TechCrunch – The Cloud’s Biggest Threat Are Data Sovereignty Laws

The beauty of the cloud is the promise of simplification and standardization — without regard to physical or geographic boundaries. It’s this “any time, any place, any device” flexibility that is driving rapid adoption.

However, new government regulations on data sovereignty threaten to complicate the delivery model that has made cloud computing attractive, presenting new concerns for companies with operations in multiple countries.

While this fall’s striking down of the United States-European Union “Safe Harbor” agreement made most of the headlines, I see the recent data localization law in Russia (which went into effect in September) as a more significant development. The law mandates that personal data on Russian citizens be stored in databases physically located within the country itself.

Under this law, companies that capture, use and store data must abide by specific requirements or face the consequences of falling out of compliance. Russia is a warning bell: with 20+ other countries currently considering similar privacy laws, the landscape will grow increasingly complex for cloud providers, and more costly for customers, chipping away at the beauty of the cloud.
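For a sense of what compliance looks like in practice, here is a minimal sketch in Python, with hypothetical region names and rules (not from the TechCrunch post), of how a provider might pin each record to an in-country storage region based on the data subject’s citizenship:

    # Hypothetical residency rules, illustration only: map a citizen's
    # country to a storage region that keeps the data in-country.
    RESIDENCY_RULES = {
        "RU": "ru-central-1",  # Russian localization law: store in Russia
    }
    DEFAULT_REGION = "us-east-1"

    def storage_region(citizen_country: str) -> str:
        """Choose a region that satisfies the residency rule, falling
        back to a default region when no rule applies."""
        return RESIDENCY_RULES.get(citizen_country, DEFAULT_REGION)

    # Data on a Russian citizen must land in the in-country region.
    assert storage_region("RU") == "ru-central-1"
    assert storage_region("FR") == "us-east-1"

Every country added to such a table means another set of in-country infrastructure to run, which is exactly the cost the article warns will chip away at the beauty of the cloud.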

More of the TechCrunch post