
Posts Tagged ‘Gartner’

You’re not really sure how it happened, but sometime between last year and the summer of 2011 you were suddenly facing a big data problem, or you were being told you were facing a big data problem, or, more accurately, you were being told that you needed a big data solution.

Funny thing was that you hadn’t really done anything drastic over the last couple of years that would seem to indicate a tsunami of data was about to breach your storage floodgates, but then again it wasn’t like you watched yourself going bald either.

(more…)


“Information is not knowledge” – Albert Einstein

I recently read a couple of posts about BigData from my friend Chris Hoff: “Infosec Fail: The Problem With BigData is Little Data” and “More on Security and BigData…Where Data Analytics and Security Collide”.

In these posts Hoff posits that the mass centralization of information will benefit the industry and that monitoring tools will experience a boon, especially those that leverage a cloud-computing architecture…

This will bring about a resurgence of DLP and monitoring tools using a variety of deployment methodologies via virtualization and cloud that was at first seen as a hindrance but will now be an incredible boon.

As Big Data and the databases/datastores it lives in interact with the proliferation of PaaS and SaaS offers, we have an opportunity to explore better ways of dealing with these problems — this is the benefit of mass centralization of information.

Hoff then goes on to describe how new data warehousing and analytics technologies, such as Hadoop, would positively impact the industry…

Even when we do start to be able to integrate and correlate event, configuration, vulnerability or logging data, it’s very IT-centric.  It’s very INFRASTRUCTURE-centric.  It doesn’t really include much value about the actual information in use/transit or the implication of how it’s being consumed or related to.

This is where using Big Data and collective pools of sourced “puddles” as part of a larger data “lake” and then mining it using toolsets such as Hadoop come into play…
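The “puddles into a lake” idea maps directly onto the mapper/reducer shape that toolsets like Hadoop run at scale. As a toy sketch of that pattern in plain Python (the log format, field names, and three-source threshold are illustrative assumptions of mine, not anything from Hoff’s posts): spot an account failing logins across sources that no single puddle would reveal.

```python
from collections import defaultdict
from itertools import chain

# Three log "puddles" from different sources; the whitespace-separated
# "time user action source_ip" format is a made-up example.
puddles = {
    "vpn":  ["09:00 alice login_fail 10.0.0.5",
             "09:01 alice login_fail 10.0.0.5"],
    "mail": ["09:02 alice login_fail 10.0.0.9",
             "09:03 bob login_ok 10.0.0.7"],
    "wiki": ["09:04 alice login_fail 10.0.0.12"],
}

def mapper(line):
    """Emit (user, source_ip) for each failed login. In Hadoop Streaming
    this would read stdin and print tab-separated pairs instead."""
    time, user, action, ip = line.split()
    if action == "login_fail":
        yield user, ip

def reducer(pairs):
    """Collect distinct source IPs per user; failures spread across
    sources only become visible once the puddles are pooled."""
    seen = defaultdict(set)
    for user, ip in pairs:
        seen[user].add(ip)
    return {user: ips for user, ips in seen.items() if len(ips) >= 3}

lake = chain.from_iterable(puddles.values())             # merge puddles -> lake
pairs = chain.from_iterable(mapper(line) for line in lake)
print(reducer(pairs))  # alice is flagged; no single puddle shows all 3 IPs
```

At real scale the same mapper and reducer would run as a Hadoop Streaming job over the pooled data rather than in a single process, but the shape of the computation is the same.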

(more…)


From Computer World UK (here)

Black Friday and Cyber Monday have come and gone. Now it’s time for Amrit Wednesday, or Thursday, or Friday—oh, whatever—to pay our industry back for all the dubious cheer it spread in 2009. Believe me, when it comes to this list, it’s much better to give than receive. Here goes:

(more…)


[Image: Gartner Magic Quadrant]

A storm is brewing throughout the analyst community as one of the largest and most influential technology analyst firms comes under fire over one of its most highly prized research artifacts, the Gartner Magic Quadrant (MQ). ZL Technologies has filed a lawsuit alleging damages from Gartner’s Email and Archiving MQ and the MQ process as a whole, in which ZL has been positioned as a Niche Player since 2005.

From ZL Technologies’ website (here)…

ZL Technologies, a San Jose-based IT company specializing in cutting-edge enterprise software solutions for e-mail and file archiving, is challenging Gartner Group and the legitimacy of Gartner’s “Magic Quadrant.” In a complaint filed on May 29, 2009, ZL claims that Gartner’s use of their proprietary “Magic Quadrant” is misleading and favors large vendors with large sales and marketing budgets over smaller innovators such as ZL that have developed higher performing products.

The complaint alleges: defamation; trade libel; false advertising; unfair competition; and negligent interference with prospective economic advantage.

For those unfamiliar with analysts, Gartner, and the Magic Quadrant, let me provide a quick overview:
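In outline, the MQ plots every vendor in a market on two axes, “completeness of vision” and “ability to execute,” and the quadrant a vendor lands in (Leaders, Challengers, Visionaries, or Niche Players) follows from those two scores. A toy sketch of the geometry (the vendors, scores, and midpoint cutoff below are hypothetical; Gartner’s actual criteria and weightings are proprietary):

```python
def quadrant(vision: float, execution: float, cutoff: float = 50.0) -> str:
    """Map the two Magic Quadrant axes to a quadrant name. The midpoint
    cutoff is a stand-in; Gartner's real scoring is opaque."""
    if vision >= cutoff and execution >= cutoff:
        return "Leaders"
    if execution >= cutoff:
        return "Challengers"    # executes well, weaker vision
    if vision >= cutoff:
        return "Visionaries"    # strong vision, weaker execution
    return "Niche Players"

# Hypothetical vendors: (completeness of vision, ability to execute)
vendors = {"MegaVendor": (80, 85), "Strategist": (70, 40), "Innovator": (45, 30)}
for name, (vision, execution) in vendors.items():
    print(f"{name:12} -> {quadrant(vision, execution)}")
# MegaVendor -> Leaders; Strategist -> Visionaries; Innovator -> Niche Players
```

ZL’s core allegation fits neatly into this picture: if “ability to execute” in practice rewards sales and marketing budgets, a smaller vendor with a higher performing product can sit in the Niche Players box for years, as ZL says it has since 2005.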

(more…)


[Image: Beyond the Perimeter]

Not too long ago I embarked on creating a podcast series that would provide more regularity than the blog. Beyond the Perimeter has been a tremendous amount of fun, and as we just posted our 50th podcast I wanted to reflect on some of the highlights and the wonderful guests we have been honored to have join us.

Beyond the Perimeter iTunes subscription

Beyond the Perimeter Direct XML Feed

(more…)


[Image: The Matrix]

Consolidation is the major benefit or “killer app” for server/data center virtualization. Standardization is the major benefit or “killer app” for client-side virtualization.

As I was pondering the challenges of current systems management processes, researching the latest and greatest from the client-side virtualization vendors, and talking to a lot of large organizations, I was trying to find that one thing that explained the operational benefits of client-side virtualization. There is more than one, but it really does come down to standardization; allow me to explain… (more…)
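One way to picture the standardization claim (a sketch of my own; the configuration fields and fleet below are invented for illustration): in a conventional fleet every PC drifts into its own combination of OS build, patch level, and installed apps, so support effectively faces as many configurations as machines, while a golden-image virtual workspace collapses that to one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Config:
    """What support actually has to reproduce; the fields are illustrative."""
    os_build: str
    patch_level: str
    apps: frozenset

# A conventional fleet: each machine has drifted into its own configuration.
fleet = [
    Config("WinXP-SP2", "KB-2008-06", frozenset({"office", "itunes"})),
    Config("WinXP-SP3", "KB-2009-01", frozenset({"office"})),
    Config("Vista",     "KB-2009-01", frozenset({"office", "cad"})),
]
print("configs to support:", len(set(fleet)))                   # 3, one per machine

# Client-side virtualization: every machine runs the same golden image,
# with user-specific state kept in a separate (portable personality) layer.
golden = Config("Win7", "KB-2010-02", frozenset({"office"}))
print("configs to support:", len(set([golden] * len(fleet))))   # 1
```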


[Image: VDI fail]

To address the increasing cost and complexity of managing dynamic IT environments, organizations are trying to understand how to adopt virtualization technologies. The value proposition and “killer app” are quite clear in the data center; far less attention has been given to the opportunities for endpoint virtualization. There are multiple methods to address client-side virtualization: hosted virtual desktops (HVD), bare-metal hypervisors, local and streaming virtual workspaces, and a range of options that layer on top of and between them all, such as application virtualization, portable personalities, and virtual composite desktops. Yet there is still a tremendous amount of confusion, and even more misconceptions about the benefits of client-side virtualization than there were with server virtualization. The major architectural flaw in almost all of these solutions is that they remain very back-end and infrastructure heavy, which undercuts the promised benefits of cost reduction and lower complexity.
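One rough way to keep that list straight is to tabulate where each model actually executes the desktop and how much back-end infrastructure it drags along. The placements and “weight” labels below are my own characterization, not a formal taxonomy:

```python
# (where the desktop/app actually executes, back-end footprint) per model;
# the footprint labels are rough judgment calls, not formal measurements.
models = {
    "hosted virtual desktop (HVD)": ("data center server",        "heavy"),
    "streaming virtual workspace":  ("client (streamed image)",   "heavy"),
    "bare-metal client hypervisor": ("client hardware",           "light"),
    "local virtual workspace":      ("client (hosted VM)",        "light"),
    "application virtualization":   ("client (per application)",  "moderate"),
    "portable personality":         ("client (user-state layer)", "light"),
}
for model, (executes_on, backend) in models.items():
    print(f"{model:30} executes on: {executes_on:26} back end: {backend}")
```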

Unlike server virtualization, where adoption was driven from the bottom up (from the hypervisor and then up through the other stacks), adoption of endpoint virtualization technologies is moving top down, starting with single applications within an existing OS. Application virtualization adoption will accelerate over the next 12-18 months, with Gartner life cycle management analysts suggesting that it will be included in the majority of PC life cycle RFPs in 2010 and beyond. Workspace/desktop virtualization will follow over the next 24-36 months, as will endpoint virtualization infrastructures. The adoption of both will align with organizations’ desktop refresh cycles. Considering that the average refresh cycle runs 3-5 years, and that many organizations are looking at a desktop refresh to support Vista (which probably has only about 10% market adoption) and Windows 7, it is conceivable that we will begin to see accelerated adoption of desktop and infrastructure virtualization over the next 24-36 months as organizations rethink their current systems management processes and technologies.

Let’s look at the 4 client/desktop virtualization models I believe will become the most prevalent over the next 3-5 years… (more…)


