28 Feb 17

TheWHIR – 3 Steps to Ensure Cloud Stability in 2017

We’re reaching a point of maturity when it comes to cloud computing. Organizations are solidifying their cloud use cases, understanding how cloud affects their business, and building entire IT models around its capabilities.

Cloud growth will only continue; Gartner recently said that more than $1 trillion in IT spending will, directly or indirectly, be affected by the shift to cloud during the next five years.

“Cloud-first strategies are the foundation for staying relevant in a fast-paced world,” said Ed Anderson, research vice president at Gartner. “The market for cloud services has grown to such an extent that it is now a notable percentage of total IT spending, helping to create a new generation of start-ups and ‘born in the cloud’ providers.”

More of TheWHIR post from Bill Kleyman


16 Feb 17

Continuity Central – Report looks at the prevalence of business-critical custom applications

At the 8th Annual Cloud Security Alliance (CSA) Summit at RSA in San Francisco, Skyhigh Networks unveiled its ‘Custom Applications and IaaS Report 2017’.

Conducted in partnership with the CSA, the report is based on a broad survey of software development, IT administration, IT security, operations and DevOps professionals across the Americas, EMEA and Asia Pacific who are involved in developing, deploying and securing custom applications. While respondents forecast rapid IaaS adoption, they also expressed numerous unresolved concerns about the security and compliance of their custom applications on IaaS platforms.

“Custom applications are a core part of how our business operates, and moving these to the cloud provides IT an opportunity to ‘start fresh’ with the right visibility, controls and overall security, without getting in the way of business operations,” said Stephen Ward, CISO, TIAA. “Meeting our security requirements for our applications, as well as our IaaS environment, is absolutely critical to accomplishing our business goals for cloud and overall software programs.”

Some of the key findings from the survey include:

Every company is a software company. Every company has developers writing custom code to improve engagement with employees, partners and customers.

More of the Continuity Central post


10 Feb 17

SearchCloudComputing – For enterprises, multicloud strategy remains a siloed approach

Although not mentioned in this article, enterprise cloud providers like Expedient are often a key player in the multicloud mix. Enterprise clouds deliver VMware or Hyper-V environments that require little or no retraining for infrastructure staff.

Enterprises need a multicloud strategy to juggle AWS, Azure and Google Cloud Platform, but the long-held promise of portability remains more dream than reality.

Most enterprises utilize more than one of the hyperscale cloud providers, but “multicloud” remains a partitioned approach for corporate IT.

Amazon Web Services (AWS) continues to dominate the public cloud infrastructure market it essentially created a decade ago, but other platforms, especially Microsoft Azure, gained a foothold inside enterprises, too. As a result, companies must balance management of the disparate environments with questions of how deep to go on a single platform, all while the notion of connectivity of resources across clouds remains more theoretical than practical.

Similar to hybrid cloud before it, multicloud has an amorphous definition among IT pros as various stakeholders glom on to the latest buzzword to position themselves as relevant players. It has come to encompass everything from the use of multiple infrastructure as a service (IaaS) clouds, both public and private, to public IaaS alongside platform as a service (PaaS) and software as a service (SaaS).

More of the SearchCloudComputing article


09 Feb 17

CIO Insight – Deep Insecurities: Things Just Keep Getting Worse

Ninety-three percent of companies’ security operations centers admit they’re not keeping up with the volume of threat alerts and incidents, putting them at risk.

Despite a growing focus on cyber-security—along with gobs of money and staff time thrown at the task—things just seem to get worse. According to a December 2016 report from McAfee Labs, 93 percent of organizations’ security operations centers admit that they are not keeping up with the volume of threat alerts and incidents, putting them at significant risk of moderate to severe breaches.

Altogether, 67 percent of the survey respondents (more than 400 security practitioners spanning multiple countries, industries and company sizes) reported an increase in security breaches. Yet, on average, organizations are unable to sufficiently investigate 25 percent of security alerts.

More of the CIO Insight article from Samuel Greengard


02 Jan 17

Informatica – What is an Enterprise Architecture Maturity Model?

Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly difficult to change and more costly to keep up to date. The solution to this predicament is an Enterprise Architecture (EA) process that can provide a framework for an optimized IT portfolio. An IT Optimization strategy should be based on a comprehensive set of architectural principles that ensure consistency and make IT more responsive, efficient, and economical.
The rationalization, standardization, and consolidation process helps organizations understand their current EA maturity level and move forward on the appropriate roadmap. As they undertake the IT Optimization journey, the IT architecture matures through several stages, leveraging IT Optimization Architecture Principles to attain each level of maturity.

Multiple Levels of Enterprise Architecture Maturity Model

Level 1: The first step involves helping a company develop its architecture vision and operating model, with attention to cost, globalization, divestiture, or whatever is driving the company strategically.

More of the Informatica post


30 Dec 16

GigaOM – The enterprise CIO is moving to a consumption-first paradigm

Take yourself back a couple of decades and the IT industry looked very different than it does today. Back then, the number of solution choices was relatively limited, and they were available only to those with the finances to afford them. Many of the core services had to be built from the ground up. Why? The IT marketplace for core services simply lacked the volume and maturity. Today, that picture is very different!

For example, consider email. Back in 1995, Microsoft Exchange was just a fledgling product that was less than two years old. The dominant email solutions were cc:Mail (acquired by Lotus in 1991), Lotus Notes (acquired by IBM in 1995) along with a myriad of mainframe, mini and UNIX-based mail servers.

Every enterprise had to set up and manage its own email environment. Solutions like Google Apps and Microsoft 365 simply did not exist. There was no real alternative…except for outsourcing.

More of the GigaOM post from Tim Crawford


29 Dec 16

Information Age – IT can make or break the success of a deal

This article is a couple of years old, but the topic of IT agility is more important than ever in the merger and acquisition space.

Businesses are increasingly under pressure to deliver value to stakeholders, particularly when undertaking bold initiatives such as mergers, acquisitions or asset disposals. This is the case not only for corporate acquirers but also for private equity (PE) firms, whose strategy is leaning toward add-on acquisitions as a means of growing their portfolio companies.

Among the fundamental changes brought on by mergers and acquisitions (M&A), management teams often must invest significant effort in restructuring or streamlining the operations of acquired businesses to deliver success in the absence of financial engineering. But given the challenging success rate of M&A activity in delivering realised value to organisations in the short and medium term, how can the parties involved in M&A actually deliver realised value?

More of the Information Age article from Tony Qui


22 Dec 16

ZDNet – If you want to be secure, get in the cloud

Some CIOs are reluctant to rely on the cloud. The high cost of a data loss means executives decide to keep information within the enterprise firewall. However, a change in stance is taking place – and many business leaders recognise that the cloud is actually a better way to keep information safe and lawmakers in check.

CIOs in all territories face a tranche of data rules. Businesses are currently preparing for another change in legislation. The European Union’s General Data Protection Regulation (GDPR) is due to come into force on 25 May 2018 and will see companies fined up to 4 per cent of their global turnover for breaches.

GDPR will require a serious step up in security policies and procedures. The potential costs, both in terms of financial and reputational damage, could leave executives out of pocket, out of a job – or even more seriously – in jail.

However, evidence suggests a wake-up call is required. Research from insurance specialist Lloyd’s suggests 92 per cent of companies have suffered a data breach in the past five years. Executives must react and take a proactive approach to information security.

More of the ZDNet post from Mark Samuels


19 Dec 16

Data Center Knowledge – TSO Logic: Cloud Migration Offers Instant Savings

Need help doing the math to see if your in-house virtual machines would be cheaper to operate in the cloud? If so, contact me.

Nearly half (45 percent) of on-premises virtualized operating system instances could run more economically in the cloud, for a 43 percent annual savings, according to research released this week by infrastructure optimization company TSO Logic. The research makes starkly clear the cost of legacy hardware and the savings potential of cloud migration.

More than one in four OS instances are over-provisioned, the company says, and migrating them to appropriately sized cloud instances would reduce their cost by 36 percent.

Drawn from an algorithmic analysis of anonymized data from TSO Logic’s North American customers, the research also showed that of 10,000 physical servers, 25 percent are at least three years old. The same workload handled by Generation-5 servers could now be done with 30 percent fewer Generation-9 servers, based on processor gains alone, the company says.
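
For anyone who wants to sketch that math themselves, here is a minimal, hypothetical example in Python: it annualizes the cost of one on-premises VM and compares it with an on-demand cloud instance. All of the figures (hardware price, amortization period, power, licensing, admin overhead, hourly instance rate) are placeholders, not TSO Logic’s data or model, and a real assessment would also factor in right-sizing over-provisioned instances, reserved pricing and data transfer.

# Rough per-VM cost comparison. Every number is a hypothetical placeholder.

HOURS_PER_YEAR = 24 * 365

def annual_onprem_cost(hardware_cost, amortization_years,
                       power_and_cooling, licensing, admin_overhead):
    """Annualized cost of running one VM on owned, on-premises hardware."""
    return (hardware_cost / amortization_years
            + power_and_cooling + licensing + admin_overhead)

def annual_cloud_cost(hourly_rate, hours=HOURS_PER_YEAR):
    """Annual cost of a comparably sized cloud instance at on-demand pricing."""
    return hourly_rate * hours

onprem = annual_onprem_cost(hardware_cost=6000, amortization_years=3,
                            power_and_cooling=400, licensing=900,
                            admin_overhead=700)    # $4,000 per year
cloud = annual_cloud_cost(hourly_rate=0.23)        # about $2,015 per year

savings = onprem - cloud
print(f"On-premises: ${onprem:,.0f}/yr")
print(f"Cloud:       ${cloud:,.0f}/yr")
print(f"Savings:     ${savings:,.0f}/yr ({savings / onprem:.0%})")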

More of the Data Center Knowledge post from Chris Burt


16 Dec 16

Arthur Cole – What’s Up with Digital Transformation?

Practical article from Arthur Cole on how digital transformation is different than what we’ve been doing for 40 years in IT.

Some enterprise executives may rightly be confused by the whole concept of “digital transformation.” Digital technology has been a common facet of the enterprise for decades, so what exactly is being transformed?

In a nutshell, the difference this time is that rather than using digital technology to support and streamline existing business processes like sales, support and customer relations, the processes themselves are becoming digitized into services. So instead of using data infrastructure to make it easier to sell, say, cars, a digitally transformed business incorporates service-level functionality into the car itself, along with all the support systems that contribute to the manufacture, sale and ongoing support of the car.

More of the IT Business Edge post from Arthur Cole