When the character Maverick from the movie Top Gun exclaimed, "I feel the
need, the need for speed", you'd be forgiven for mistaking it for a sound
bite from a CIO discussing their transactional databases. Whether it's a
financial organization predicting share prices, a bank knowing whether it can
approve a loan or a marketing organisation reaching consumers with a
compelling promotional offer, the need to access, store, process and analyze
data as quickly as possible is an imperative for any business looking to gain
a competitive edge. Hence, when SAP announced its new in-memory platform HANA
for enterprise applications in 2011, everyone took note as the company touted
the advantage of real-time analytics. SAP HANA promised not just to make
databases dramatically faster, as traditional business warehouse accelerator
systems had, but to speed up the front end, enabling c... (more)
Today we will be talking about VCE's cloud infrastructure product, the
Vblock. Gartner's recent study, which predicts that by next year 60% of
enterprises will have embraced some form of cloud adoption, has energized the
competitive cloud vendor market. But at the same time, does the cloud industry
need to be
driven by vendor competition or vendor collaboration? Archie Hendryx of VCE
Technology Solutions discusses this very matter.
EM360°: Could you tell us about VCE and why cloud has played a big part in
your company's solutions?
Archie: VCE is a unique start-up company formed via joint investments from
EMC, Cisco, VMware and Intel that has been operating for just over three
years. Its focus is solely on building the world's most advanced converged
infrastructure, the Vblock. The Vblock is a pretested, prevalidated,
preconfigured and, more importantly, pre-integrated infrastruct... (more)
For most organizations, application releases are extremely tense and
pressurized situations in which risk mitigation and tight deadlines are
key. This is made worse by internal silos and the
consequent lack of cohesion that exists not just within the microcosm of IT
infrastructure teams but also amongst the broader departments of development,
QA and operations. Now, with the increasing demand on IT from application and
business unit stakeholders for new releases to be deployed quickly and
successfully, the interdependence of software development and IT operations
is being seen as integral to the successful delivery of IT services.
Consequently, businesses are recognizing that this can't be achieved unless
the traditional methodologies and silos are readdressed or changed. Cue the
emergence of a new methodology that's simply... (more)
The term "software defined" has taken many forms in recent months, from
Software Defined Datacenter (SDDC) and Software Defined Infrastructure (SDI)
to even component vendors adopting the tagline to advance their own agendas
with Software Defined Networking (SDN) and Software Defined Storage (SDS).
Yet ironically, the majority of the vendors adopting the tagline also sell
infrastructure product lines that a "software defined" approach is aiming to
make irrelevant.
The emergence of the cloud made clear to the industry that the procurement,
design and deployment of the infrastructure components of network, storage
and compute were a hindrance to application delivery. The inability of
infrastructure components to be quickly and successfully coordinated, as well
as to respond automatically to application needs, has led many to question
why tradition... (more)
First and foremost, you can't have a successful software-defined model if
your team still has a hardware-defined mentality. Change is inevitable and,
whether it's embraced or not, it will happen. For experienced CIOs this is
not the first time they've encountered such technological and, consequently,
cultural change in IT.
Question 1. Vendors are racing to lead the movement towards a
software‐defined data centre. Where are we up to in this journey, and how
far are we from seeing this trend widely adopted?
Considering that most organizations have yet to fully virtualize or move
towards a true Private Cloud model, SDDC is still in its infancy in terms of
mainstream adoption and certainly won't be an overnight process. While
typical early adopters are advancing quickly down the software-defined route
these are mostly organizations with large scale multi-site data centers w... (more)
Archie Hendryx looks at the benefits and challenges of managing multiple IT
components through a single support solution
Many public sector organisations are not changing their ICT legacy systems.
As a result, they face an increasing number of inefficiencies and challenges.
These issues can nevertheless be overcome, and performance increased, by
managing these legacy systems within a converged infrastructure environment.
This approach can also reduce the risks associated with legacy systems, such
as security, missing functionality, increased complexity and operational
costs.
Converged infrastructure simplifies support as it packages multiple IT
components from different vendors into one single, optimised computing
solution. It therefore offers end-to-end support, unlike reference
architectures, which require organisations to deal with a multitude of
disparate vendors t... (more)
Virtualization and cloud computing certainly get a lot of hype. But how is
that being translated into actual usage by enterprises?
A recent survey looked into just how deep the penetration really is, as well
as what functions companies were virtualizing and sending to the cloud.
More than 200 organizations were questioned by four companies: Virtual
Instruments, Altor Networks, Juniper Networks and Sendmail. The surveys
covered several areas, while focusing on virtualization and the cloud.
"Our results show companies are increasingly motivated to virtualize
applications for cost savings and business agility reasons," said Len
Rosenthal, vice-president of marketing at Virtual Instruments.
"But, even with all the benefits of virtualization, performance concerns
continue to be a significant stumbling block to virtualizing business
critical applications, which are inher... (more)
VP of Operations concludes, "I get it! We've been investing in hardware worth
more than a million pounds to bring the performance latency of our
Datawarehouse down by 10 milliseconds when all the time I just needed to
replace a cable!"
In this last week of customer visits I was astounded to have the above
sentence said to me on two separate occasions by two separate companies. What
really took me aback was that this is exactly how I felt prior to joining
Virtual Instruments as a Solutions Consultant six months ago. It's a bold
claim, but one I will certainly stand by and am happy to prove to anyone
(feel free to send me a PM on LinkedIn).
For example at a recent POV engagement with a VP of Operations, we
demonstrated how Virtual Instruments’ solution had pinpointed problems with
ports that were connected to their critical Datawarehouse environment as well
as their ... (more)
On a recent excursion to a tech event I had the pleasure of meeting a
well-known ‘VM Guru', (who shall remain nameless). Having read some of this
individual's material I was excited and intrigued to know his thoughts on how
he was tackling the Storage challenges related to VMware especially with
Fibre Channel SANs.
"Storage, that's nothing to do with me, I'm a VirtGuy", he proudly announced.
To which I retorted, "Yes, but if there are physical layer issues in your SAN
fabric, or poorly configured storage, it will affect the performance of your
Virtual Machines and their applications, hence surely you also need some
visibility and understanding beyond your server's HBAs?"
Seemingly annoyed with the question, he answered, "Why? I have SAN architects
and a Storage team for that, it's not my problem. I told you I'm a VirtGuy, I
have my tools so I can check esxtop, vCe... (more)
Roll back several years and certain vendors had you believe that Fibre
Channel was dead and that the future would be iSCSI. A few years later and
certain vendors were then declaring that Fibre Channel was dead again and
that the future was FCoE. So while this article is not a iSCSI vs FC or FC vs
FCoE comparison list (there's plenty of good ones out there and both iSCSI or
FCoE each have immense merit), the point being made here is that Fibre
Channel unlike Elvis really is alive and well. Moreover Fibre Channel still
remains the protocol of choice for most mission-critical applications despite
the FUD that surrounds its cost, manageability and future existence.
Most storage folk who run enterprise class infrastructures are advocates of
Fibre Channel not only because of its high performance connectivity
infrastructure but also due to its reliability, security and scal... (more)
Yearly prediction blogs are so clichéd, which is why I've always tried to
avoid writing one. Despite this, I've always made a mental note of the
technologies, products and companies that I thought were going to do really
well in the upcoming year. Back in 2008 I felt VMware was going to really
take off after
the release of 3.5. In 2009 I had a gut feeling DataDomain would explode just
before they were bought by EMC. In 2010 I spoke to a friend about how 3PAR's
technology could no longer be ignored and in 2011 I still wasn't convinced
that FCoE would overtake FC in revenue despite all the analysts' claims. But
why believe me when I'd never put these thoughts on paper? So now at the
beginning of 2012, I've decided to put my money where my mouth is, pull out
my crystal ball and document my predictions.
First off I'm going with VCE's Vblock and their new FastPath feature. VCE